| sha (stringlengths 40-40) | text (stringlengths 1-13.4M) | id (stringlengths 2-117) | tags (sequencelengths 1-7.91k) | created_at (stringlengths 25-25) | metadata (stringlengths 2-875k) | last_modified (stringlengths 25-25) | arxiv (sequencelengths 0-25) | languages (sequencelengths 0-7.91k) | tags_str (stringlengths 17-159k) | text_str (stringlengths 1-447k) | text_lists (sequencelengths 0-352) | processed_texts (sequencelengths 1-353) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
3a55dd1ee9f3bf5254cdf62280c52e43b2ef7d06 |
# Gnosis
This dataset was provided by jeiku | Epiculous/Gnosis | [
"language:en",
"license:agpl-3.0",
"region:us"
] | 2024-01-21T21:53:16+00:00 | {"language": ["en"], "license": "agpl-3.0"} | 2024-02-04T17:31:35+00:00 | [] | [
"en"
] | TAGS
#language-English #license-agpl-3.0 #region-us
|
# Gnosis
This dataset was provided by jeiku | [
"# Gnosis\n\nThis dataset was provided by jeiku"
] | [
"TAGS\n#language-English #license-agpl-3.0 #region-us \n",
"# Gnosis\n\nThis dataset was provided by jeiku"
] |
67b329efa1185ec85ba0bc07301302999450bce8 |
# Dataset of sarya (Granblue Fantasy)
This is the dataset of sarya (Granblue Fantasy), containing 47 images and their tags.
The core tags of this character are `long_hair, horns, pointy_ears, breasts, glasses, ponytail, large_breasts, blonde_hair, green_eyes, ribbon, hair_ribbon, bow, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 47 | 36.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarya_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 47 | 27.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarya_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 100 | 55.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarya_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 47 | 34.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarya_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 100 | 64.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarya_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sarya_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
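If you only need one of the IMG+TXT packages from the table above rather than the Waifuc-Raw archive, the sketch below follows the same pattern. It assumes the archive contains flat image files with same-named `.txt` tag files next to them; the filename `dataset-800.zip` comes from the package table, and the local directory name is illustrative.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the pre-packaged variants listed above (the 800px package here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/sarya_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file (assumed IMG+TXT layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(name, '->', tags)
```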
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 47 |  |  |  |  |  | 1girl, draph, solo, looking_at_viewer, blush, smile, white_gloves, simple_background, necktie, short_sleeves, white_background, open_mouth, plaid_skirt, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | solo | looking_at_viewer | blush | smile | white_gloves | simple_background | necktie | short_sleeves | white_background | open_mouth | plaid_skirt | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:--------|:--------|:---------------|:--------------------|:----------|:----------------|:-------------------|:-------------|:--------------|:--------|
| 0 | 47 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sarya_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T22:03:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:12:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sarya (Granblue Fantasy)
===================================
This is the dataset of sarya (Granblue Fantasy), containing 47 images and their tags.
The core tags of this character are 'long\_hair, horns, pointy\_ears, breasts, glasses, ponytail, large\_breasts, blonde\_hair, green\_eyes, ribbon, hair\_ribbon, bow, brown\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
98211953a6d65499ef8b2aa6ca745c860987ea90 |
# Dataset of erin (Granblue Fantasy)
This is the dataset of erin (Granblue Fantasy), containing 25 images and their tags.
The core tags of this character are `long_hair, pointy_ears, blue_eyes, hair_ornament, bangs, blue_hair, breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 32.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erin_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 22.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erin_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 57 | 42.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erin_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 30.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erin_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 57 | 55.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erin_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/erin_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, boots, crystal, looking_at_viewer, solo, detached_sleeves, sitting, bare_shoulders, black_footwear, blue_thighhighs, blush, ice, knees_up, open_mouth, see-through, simple_background, sleeveless_dress, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | boots | crystal | looking_at_viewer | solo | detached_sleeves | sitting | bare_shoulders | black_footwear | blue_thighhighs | blush | ice | knees_up | open_mouth | see-through | simple_background | sleeveless_dress | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:--------------------|:-------|:-------------------|:----------|:-----------------|:-----------------|:------------------|:--------|:------|:-----------|:-------------|:--------------|:--------------------|:-------------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/erin_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T22:03:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:08:15+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of erin (Granblue Fantasy)
==================================
This is the dataset of erin (Granblue Fantasy), containing 25 images and their tags.
The core tags of this character are 'long\_hair, pointy\_ears, blue\_eyes, hair\_ornament, bangs, blue\_hair, breasts, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
27869811a6af04569e955665365c52f7dcaf3528 | # lilac/lmsys-chat-1m
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/lmsys/lmsys-chat-1m](https://huggingface.co/datasets/lmsys/lmsys-chat-1m)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-lmsys-chat-1m
```
or from Python with:
```py
import lilac as ll  # assumes the lilac package is installed

ll.download("lilacai/lilac-lmsys-chat-1m")
```
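If you prefer to fetch the repository files into a specific local directory without the Lilac CLI, a minimal sketch using `huggingface_hub` is shown below; the target directory name is illustrative.
```py
from huggingface_hub import snapshot_download

# fetch the full dataset repository into a local folder of your choice
local_dir = snapshot_download(
    repo_id="lilacai/lilac-lmsys-chat-1m",
    repo_type="dataset",
    local_dir="lilac-lmsys-chat-1m",
)
print(local_dir)
```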
| lilacai/lilac-lmsys-chat-1m | [
"Lilac",
"region:us"
] | 2024-01-21T22:11:24+00:00 | {"tags": ["Lilac"]} | 2024-01-29T16:09:29+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/lmsys-chat-1m
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/lmsys-chat-1m\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/lmsys-chat-1m\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
42c6d7830d5c28c41905cca51994f54891418a59 |
# Dataset Card for Evaluation run of chargoddard/internlm2-base-7b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-7b-llama](https://huggingface.co/chargoddard/internlm2-base-7b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama",
                    "harness_winogrande_5",
                    split="train")
```
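The aggregated results can be loaded the same way (a sketch; the configuration name "results" is taken from the description above, and the "train" split always points to the latest results):
```python
from datasets import load_dataset

# load the aggregated "results" configuration instead of a single task
results = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama",
                       "results",
                       split="train")
```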
## Latest results
These are the [latest results from run 2024-01-21T22:11:28.111983](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama/blob/main/results_2024-01-21T22-11-28.111983.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5380339332515804,
"acc_stderr": 0.03359422386201474,
"acc_norm": 0.5448214536703925,
"acc_norm_stderr": 0.03431835769902873,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834569,
"mc2": 0.43232098792021034,
"mc2_stderr": 0.014402330839994766
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536593,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.01455594976049644
},
"harness|hellaswag|10": {
"acc": 0.59061939852619,
"acc_stderr": 0.004907146229347549,
"acc_norm": 0.7946624178450508,
"acc_norm_stderr": 0.004031225342516808
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196156,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196156
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278233,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7305236270753512,
"acc_stderr": 0.015866243073215054,
"acc_norm": 0.7305236270753512,
"acc_norm_stderr": 0.015866243073215054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.026636539741116082,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.026636539741116082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581993,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581993
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400662,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834569,
"mc2": 0.43232098792021034,
"mc2_stderr": 0.014402330839994766
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038611
},
"harness|gsm8k|5": {
"acc": 0.19181197877179681,
"acc_stderr": 0.010845169955294016
}
}
```
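To pull the headline numbers out of a results file with the structure shown above, a small sketch is given below. It assumes the JSON has been saved locally; the filename mirrors the run timestamp in the link above and is otherwise illustrative.
```python
import json

# parse a locally saved results file (structure as printed above)
with open("results_2024-01-21T22-11-28.111983.json", "r", encoding="utf-8") as f:
    results = json.load(f)

# overall aggregated metrics
overall = results["all"]
print("acc:", overall["acc"], "acc_norm:", overall["acc_norm"])

# average acc_norm over the MMLU (hendrycksTest) tasks only
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print("MMLU acc_norm (mean):", sum(mmlu) / len(mmlu))
```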
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama | [
"region:us"
] | 2024-01-21T22:13:35+00:00 | {"pretty_name": "Evaluation run of chargoddard/internlm2-base-7b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-7b-llama](https://huggingface.co/chargoddard/internlm2-base-7b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T22:11:28.111983](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama/blob/main/results_2024-01-21T22-11-28.111983.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5380339332515804,\n \"acc_stderr\": 0.03359422386201474,\n \"acc_norm\": 0.5448214536703925,\n \"acc_norm_stderr\": 0.03431835769902873,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834569,\n \"mc2\": 0.43232098792021034,\n \"mc2_stderr\": 0.014402330839994766\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536593,\n \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.01455594976049644\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59061939852619,\n \"acc_stderr\": 0.004907146229347549,\n \"acc_norm\": 0.7946624178450508,\n \"acc_norm_stderr\": 0.004031225342516808\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196156,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196156\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.027575960723278233,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.027575960723278233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.7668393782383419,\n 
\"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7305236270753512,\n \"acc_stderr\": 0.015866243073215054,\n \"acc_norm\": 0.7305236270753512,\n \"acc_norm_stderr\": 0.015866243073215054\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.026636539741116082,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.026636539741116082\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.02824513402438729,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.02824513402438729\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.027604689028581993,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 0.027604689028581993\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n \"acc_stderr\": 0.012585471793400662,\n \"acc_norm\": 0.4152542372881356,\n \"acc_norm_stderr\": 0.012585471793400662\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834569,\n \"mc2\": 0.43232098792021034,\n \"mc2_stderr\": 0.014402330839994766\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038611\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.19181197877179681,\n \"acc_stderr\": 0.010845169955294016\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/internlm2-base-7b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|arc:challenge|25_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|gsm8k|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hellaswag|10_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["**/details_harness|winogrande|5_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T22-11-28.111983.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T22_11_28.111983", "path": ["results_2024-01-21T22-11-28.111983.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T22-11-28.111983.parquet"]}]}]} | 2024-01-21T22:13:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/internlm2-base-7b-llama
Dataset automatically created during the evaluation run of model chargoddard/internlm2-base-7b-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
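A minimal sketch with the `datasets` library is shown below; the repository name assumes the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is just one of the 63 available configurations:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this run; swap the config name
# for any other evaluated task, or use "results" for the aggregated metrics.
data = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama",
    "harness_winogrande_5",
    split="train",
)
print(data)
```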
## Latest results
These are the latest results from run 2024-01-21T22:11:28.111983 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chargoddard/internlm2-base-7b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-base-7b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T22:11:28.111983(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/internlm2-base-7b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-base-7b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T22:11:28.111983(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1fc3cf7b38e57b1207edd395f4d9edab58e3c68a | This dataset is based on the Japanese version of the Wikipedia dataset and was converted into a multi-turn conversation format using llama2Pro8B.
Since it is released under the llama2 license, it can be used commercially for services.
Some strange dialogue may be included as it has not been screened by humans.
We generated over 80,000 conversations over 22 days on an A100 80GBx7 machine and automatically screened them.
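A minimal loading sketch with the `datasets` library; the `train` split name is an assumption (the usual default for Hugging Face datasets), and the record fields depend on the dataset's actual schema:

```python
from datasets import load_dataset

# Load the multi-turn Japanese Wikipedia conversations and inspect one record.
ds = load_dataset("shi3z/ja_conv_wikipedia_llama2pro8b_30k", split="train")
print(ds[0])  # one converted multi-turn conversation; exact fields depend on the schema
```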
# Model
https://huggingface.co/spaces/TencentARC/LLaMA-Pro-8B-Instruct-Chat
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Compute by
Tsuginosuke AI SuperComputer
FreeAI Ltd.
https://free-ai.ltd | shi3z/ja_conv_wikipedia_llama2pro8b_30k | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:ja",
"license:llama2",
"region:us"
] | 2024-01-21T22:14:41+00:00 | {"language": ["ja"], "license": "llama2", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"]} | 2024-01-21T22:16:01+00:00 | [] | [
"ja"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-llama2 #region-us
| This dataset is based on the Japanese version of the Wikipedia dataset and was converted into a multi-turn conversation format using llama2Pro8B.
Since it is released under the llama2 license, it can be used commercially for services.
Some strange dialogue may be included as it has not been screened by humans.
We generated over 80,000 conversations over 22 days on an A100 80GBx7 machine and automatically screened them.
# Model
URL
# Dataset
URL
# Compute by
Tsuginosuke AI SuperComputer
FreeAI Ltd.
URL | [
"# Model\nURL",
"# Dataset\nURL",
"# Compute by\nTsuginosuke AI SuperComputer\nFreeAI Ltd.\n\nURL"
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-llama2 #region-us \n",
"# Model\nURL",
"# Dataset\nURL",
"# Compute by\nTsuginosuke AI SuperComputer\nFreeAI Ltd.\n\nURL"
] |
7b08bf5cbbadde21cccc099f584441ae654e9b47 |
# Basic Math 1M
A dataset of 1 million basic arithmetic problems with potential user prompts. See [the numerical version](https://huggingface.co/datasets/lmlab/basic-math-1m-numerical) for a version with only numbers.
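As a quick way to see what the problems look like, here is a minimal loading sketch; the `train` split name and the use of the `datasets` library are assumptions, and the column names depend on the actual schema:

```python
from datasets import load_dataset

# Load the 1M arithmetic problems and peek at a few records.
ds = load_dataset("lmlab/basic-math-1m", split="train")
for row in ds.select(range(3)):
    print(row)  # exact field names depend on the dataset schema
```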
## License
Basic Math 1M is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license; you may choose either. If you are interested in including this dataset in another differently-licensed dataset, please contact me.
## Credit
Basic Math 1M was inspired by [Simple Math](https://huggingface.co/datasets/fblgit/simple-math) but was created independently. | lmlab/basic-math-1m | [
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:cc-by-sa-4.0",
"license:gpl",
"math",
"region:us"
] | 2024-01-21T22:20:56+00:00 | {"language": ["en"], "license": ["cc-by-sa-4.0", "gpl"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "text2text-generation"], "pretty_name": "Basic Math 1M", "tags": ["math"]} | 2024-01-22T21:56:33+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-text2text-generation #size_categories-1M<n<10M #language-English #license-cc-by-sa-4.0 #license-gpl #math #region-us
|
# Basic Math 1M
A dataset of 1 million basic arithmetic problems with potential user prompts. See the numerical version for a version with only numbers.
## License
Basic Math 1M is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license; you may choose either. If you are interested in including this dataset in another differently-licensed dataset, please contact me.
## Credit
Basic Math 1M was inspired by Simple Math but was created independently. | [
"# Basic Math 1M\n\nA dataset of 1 million basic arithmetic problems with potential user prompts. See the numerical version for a version with only numbers.",
"## License\n\nBasic Math 1M is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license, you may choose either at your choice. If you are interested in including this dataset in another differently-licensed dataset, please contact me.",
"## Credit\n\nBasic Math 1M was inspired by Simple Math but was created independently."
] | [
"TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-1M<n<10M #language-English #license-cc-by-sa-4.0 #license-gpl #math #region-us \n",
"# Basic Math 1M\n\nA dataset of 1 million basic arithmetic problems with potential user prompts. See the numerical version for a version with only numbers.",
"## License\n\nBasic Math 1M is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license, you may choose either at your choice. If you are interested in including this dataset in another differently-licensed dataset, please contact me.",
"## Credit\n\nBasic Math 1M was inspired by Simple Math but was created independently."
] |
06a1ee4bbf27eb7a27fb7e3efec3351cce1b2054 |
# Dataset of linaria (Granblue Fantasy)
This is the dataset of linaria (Granblue Fantasy), containing 23 images and their tags.
The core tags of this character are `bow, hair_bow, pink_hair, bangs, purple_hair, hair_bun, red_bow, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 20.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linaria_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 15.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linaria_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 26.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linaria_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 20.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linaria_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 33.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linaria_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/linaria_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, open_mouth, smile, blush, solo, looking_at_viewer, heart, puffy_sleeves, skirt, short_sleeves, simple_background, white_background, flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | smile | blush | solo | looking_at_viewer | heart | puffy_sleeves | skirt | short_sleeves | simple_background | white_background | flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:--------|:-------|:--------------------|:--------|:----------------|:--------|:----------------|:--------------------|:-------------------|:---------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/linaria_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T22:26:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:31:41+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of linaria (Granblue Fantasy)
=====================================
This is the dataset of linaria (Granblue Fantasy), containing 23 images and their tags.
The core tags of this character are 'bow, hair\_bow, pink\_hair, bangs, purple\_hair, hair\_bun, red\_bow, sidelocks', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f6136500d45b69c45fd31b6ffe7c8e8565a9e6ad |
# Dataset of laguna (Granblue Fantasy)
This is the dataset of laguna (Granblue Fantasy), containing 37 images and their tags.
The core tags of this character are `blonde_hair, horns, pointy_ears, short_hair, breasts, hair_over_one_eye, large_breasts, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 30.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laguna_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 37 | 22.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laguna_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 76 | 41.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laguna_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 37 | 28.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laguna_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 76 | 49.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laguna_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/laguna_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, draph, solo, looking_at_viewer, simple_background, pantyhose, blue_necktie, weapon, white_background, holding, necktie_between_breasts |
| 1 | 13 |  |  |  |  |  | 1girl, blush, draph, hetero, solo_focus, 1boy, necktie, open_mouth, penis, covered_nipples, mosaic_censoring, paizuri, bare_shoulders, cum_on_body, gloves, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | solo | looking_at_viewer | simple_background | pantyhose | blue_necktie | weapon | white_background | holding | necktie_between_breasts | blush | hetero | solo_focus | 1boy | necktie | open_mouth | penis | covered_nipples | mosaic_censoring | paizuri | bare_shoulders | cum_on_body | gloves | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:--------------------|:------------|:---------------|:---------|:-------------------|:----------|:--------------------------|:--------|:---------|:-------------|:-------|:----------|:-------------|:--------|:------------------|:-------------------|:----------|:-----------------|:--------------|:---------|:--------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/laguna_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T22:26:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:34:21+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of laguna (Granblue Fantasy)
====================================
This is the dataset of laguna (Granblue Fantasy), containing 37 images and their tags.
The core tags of this character are 'blonde\_hair, horns, pointy\_ears, short\_hair, breasts, hair\_over\_one\_eye, large\_breasts, blue\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
73714bbab6f21677e74c1afba2673549f17cddbd |
# Basic Math 1M Numerical
A dataset of 1 million basic arithmetic problems with only numbers. See [the original version](https://huggingface.co/datasets/lmlab/basic-math-1m) for a version with potential prompts as well.
## License
Basic Math 1M Numerical is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license; you may choose either. If you are interested in including this dataset in another differently-licensed dataset, please contact me.
## Credit
Basic Math 1M was inspired by [Simple Math](https://huggingface.co/datasets/fblgit/simple-math) but was created independently. | lmlab/basic-math-1m-numerical | [
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:cc-by-sa-4.0",
"license:gpl",
"math",
"region:us"
] | 2024-01-21T22:28:14+00:00 | {"language": ["en"], "license": ["cc-by-sa-4.0", "gpl"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "text2text-generation"], "pretty_name": "Basic Math 1M", "tags": ["math"]} | 2024-01-22T21:56:50+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-text2text-generation #size_categories-1M<n<10M #language-English #license-cc-by-sa-4.0 #license-gpl #math #region-us
|
# Basic Math 1M Numerical
A dataset of 1 million basic arithmetic problems with only numbers. See the original version for a version with potential prompts as well.
## License
Basic Math 1M Numerical is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license; you may choose either. If you are interested in including this dataset in another differently-licensed dataset, please contact me.
## Credit
Basic Math 1M was inspired by Simple Math but was created independently. | [
"# Basic Math 1M Numerical\n\nA dataset of 1 million basic arithmetic problems with only numbers. See the original version for a version with potential prompts as well.",
"## License\n\nBasic Math 1M Numerical is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license, you may choose either at your choice. If you are interested in including this dataset in another differently-licensed dataset, please contact me.",
"## Credit\n\nBasic Math 1M was inspired by Simple Math but was created independently."
] | [
"TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-1M<n<10M #language-English #license-cc-by-sa-4.0 #license-gpl #math #region-us \n",
"# Basic Math 1M Numerical\n\nA dataset of 1 million basic arithmetic problems with only numbers. See the original version for a version with potential prompts as well.",
"## License\n\nBasic Math 1M Numerical is dual-licensed under the GNU GPL license and the CC-BY-SA 4.0 license, you may choose either at your choice. If you are interested in including this dataset in another differently-licensed dataset, please contact me.",
"## Credit\n\nBasic Math 1M was inspired by Simple Math but was created independently."
] |
88867a9c95313eb278501c1355ad600a3a5869c1 |
# Dataset of selfira (Granblue Fantasy)
This is the dataset of selfira (Granblue Fantasy), containing 11 images and their tags.
The core tags of this character are `animal_ears, red_hair, long_hair, bangs, breasts, ponytail, mole, medium_breasts, brown_eyes, mole_under_eye, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 9.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 6.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 13.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 8.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 15.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/selfira_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, erune, solo, looking_at_viewer, red_dress, bare_shoulders, simple_background, detached_sleeves, bare_back, cape, from_behind, looking_back, ass, backless_dress, blush, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | solo | looking_at_viewer | red_dress | bare_shoulders | simple_background | detached_sleeves | bare_back | cape | from_behind | looking_back | ass | backless_dress | blush | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:------------|:-----------------|:--------------------|:-------------------|:------------|:-------|:--------------|:---------------|:------|:-----------------|:--------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/selfira_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T23:02:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T23:04:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of selfira (Granblue Fantasy)
=====================================
This is the dataset of selfira (Granblue Fantasy), containing 11 images and their tags.
The core tags of this character are 'animal\_ears, red\_hair, long\_hair, bangs, breasts, ponytail, mole, medium\_breasts, brown\_eyes, mole\_under\_eye, brown\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
206387fb38f7f38f3d102a40be0987e7d547f7ee |
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-merge](https://huggingface.co/abhishekchohan/mistral-7B-forest-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T23:23:15.649063](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge/blob/main/results_2024-01-21T23-23-15.649063.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6022067316089463,
"acc_stderr": 0.032877722301518426,
"acc_norm": 0.6045609403878123,
"acc_norm_stderr": 0.03353760382711908,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5748469157653282,
"mc2_stderr": 0.015758784357589765
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6519617606054571,
"acc_stderr": 0.004753746951620151,
"acc_norm": 0.8440549691296555,
"acc_norm_stderr": 0.0036206175507473956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302927,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934499,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934499
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486548,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684972,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684972
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.01268590653820624,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.01268590653820624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440307,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5748469157653282,
"mc2_stderr": 0.015758784357589765
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712664
},
"harness|gsm8k|5": {
"acc": 0.511751326762699,
"acc_stderr": 0.013768680408142806
}
}
```
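The same numbers can also be pulled programmatically rather than read from the JSON above by hand. The snippet below is a minimal sketch (not part of the original evaluation tooling), assuming the per-task aggregates are stored under a top-level `results` key in the linked results file:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the latest aggregated results file referenced above from the dataset repo.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge",
    repo_type="dataset",
    filename="results_2024-01-21T23-23-15.649063.json",
)

with open(results_path) as f:
    results = json.load(f)

# Assumption: per-task aggregates sit under the "results" key,
# mirroring the structure shown in the snippet above.
for task, metrics in results.get("results", {}).items():
    print(task, metrics)
```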
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge | [
"region:us"
] | 2024-01-21T23:21:35+00:00 | {"pretty_name": "Evaluation run of abhishekchohan/mistral-7B-forest-merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-merge](https://huggingface.co/abhishekchohan/mistral-7B-forest-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T23:23:15.649063](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge/blob/main/results_2024-01-21T23-23-15.649063.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6022067316089463,\n \"acc_stderr\": 0.032877722301518426,\n \"acc_norm\": 0.6045609403878123,\n \"acc_norm_stderr\": 0.03353760382711908,\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5748469157653282,\n \"mc2_stderr\": 0.015758784357589765\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6519617606054571,\n \"acc_stderr\": 0.004753746951620151,\n \"acc_norm\": 0.8440549691296555,\n \"acc_norm_stderr\": 0.0036206175507473956\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302927,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n \"acc_norm\": 0.8134715025906736,\n 
\"acc_norm_stderr\": 0.028112091210117467\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.014774358319934499,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934499\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.01611523550486548,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.01611523550486548\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684972,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684972\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.01268590653820624,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.01268590653820624\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440307,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5748469157653282,\n \"mc2_stderr\": 0.015758784357589765\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.511751326762699,\n \"acc_stderr\": 0.013768680408142806\n }\n}\n```", "repo_url": "https://huggingface.co/abhishekchohan/mistral-7B-forest-merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|arc:challenge|25_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|arc:challenge|25_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|gsm8k|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|gsm8k|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hellaswag|10_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hellaswag|10_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-19-15.004437.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T23-19-15.004437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": 
"2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-19-15.004437.parquet"]}, 
{"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["**/details_harness|winogrande|5_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": ["**/details_harness|winogrande|5_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T23-23-15.649063.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T23_19_15.004437", "path": ["results_2024-01-21T23-19-15.004437.parquet"]}, {"split": "2024_01_21T23_23_15.649063", "path": 
["results_2024-01-21T23-23-15.649063.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T23-23-15.649063.parquet"]}]}]} | 2024-01-21T23:26:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-merge
Dataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-merge on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
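A minimal sketch of that loading call is given below; the repository id `open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge` is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and `harness_winogrande_5` is just one example configuration name from this dataset.
```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's naming convention.
repo_id = "open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge"

# Load one evaluated task configuration; the "latest" split points to the
# most recent of the runs recorded for this model.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```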
## Latest results
These are the latest results from run 2024-01-21T23:23:15.649063 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-merge\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T23:23:15.649063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-merge\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T23:23:15.649063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e5ab116e3f0827cdb98d437aa2706585ede20975 | # Dataset Card for "hub-report-raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Kevinger/hub-report-raw | [
"region:us"
] | 2024-01-21T23:38:32+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "score", "dtype": "float64"}, {"name": "label", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8565909, "num_examples": 3159}], "download_size": 0, "dataset_size": 8565909}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T22:02:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "hub-report-raw"
More Information needed | [
"# Dataset Card for \"hub-report-raw\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"hub-report-raw\"\n\nMore Information needed"
] |
473acca6a2964ed0c39e65e89436c15b5325e2ae |
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-med-merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-med-merge](https://huggingface.co/abhishekchohan/mistral-7B-med-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
"harness_winogrande_5",
split="train")
```
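The aggregated numbers reported below are also stored in the "results" configuration, so they can be loaded the same way; a small sketch, assuming the "latest" split naming used by this dataset's configurations:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run.
results = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
    "results",
    split="latest",
)

# One row per evaluation run; inspect the available columns first.
print(results.column_names)
```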
## Latest results
These are the [latest results from run 2024-01-21T23:55:53.444793](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge/blob/main/results_2024-01-21T23-55-53.444793.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5813577401888572,
"acc_stderr": 0.03392117164078901,
"acc_norm": 0.5837504497394033,
"acc_norm_stderr": 0.03462209771980794,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.536535823977723,
"mc2_stderr": 0.015589392373408393
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257187,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094089
},
"harness|hellaswag|10": {
"acc": 0.6461860187213703,
"acc_stderr": 0.004771751187407019,
"acc_norm": 0.8296156144194383,
"acc_norm_stderr": 0.0037520176390837515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137503,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137503
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033543,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162702,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162702
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388866,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647012,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038245,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038245
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.536535823977723,
"mc2_stderr": 0.015589392373408393
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.4495830174374526,
"acc_stderr": 0.013702290047884738
}
}
```
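The same run-level metrics can also be read from the JSON file linked above; a short sketch using `huggingface_hub` (only the filename from the link is taken as given, and the file's top-level layout is not assumed):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
    filename="results_2024-01-21T23-55-53.444793.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Inspect the top-level keys first; the per-task metrics excerpted above
# live somewhere inside this mapping.
print(list(raw.keys()))
```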
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge | [
"region:us"
] | 2024-01-21T23:58:12+00:00 | {"pretty_name": "Evaluation run of abhishekchohan/mistral-7B-med-merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-med-merge](https://huggingface.co/abhishekchohan/mistral-7B-med-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T23:55:53.444793](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge/blob/main/results_2024-01-21T23-55-53.444793.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5813577401888572,\n \"acc_stderr\": 0.03392117164078901,\n \"acc_norm\": 0.5837504497394033,\n \"acc_norm_stderr\": 0.03462209771980794,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.536535823977723,\n \"mc2_stderr\": 0.015589392373408393\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257187,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094089\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6461860187213703,\n \"acc_stderr\": 0.004771751187407019,\n \"acc_norm\": 0.8296156144194383,\n \"acc_norm_stderr\": 0.0037520176390837515\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.7668393782383419,\n 
\"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.031811100324139245,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.031811100324139245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137503,\n \"acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137503\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647553,\n \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647553\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033543,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033543\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.6756066411238825,\n \"acc_stderr\": 0.016740929047162702,\n \"acc_norm\": 0.6756066411238825,\n \"acc_norm_stderr\": 0.016740929047162702\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388866,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n \"acc_stderr\": 0.027731258647012,\n \"acc_norm\": 0.6077170418006431,\n \"acc_norm_stderr\": 0.027731258647012\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038245,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038245\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.536535823977723,\n \"mc2_stderr\": 0.015589392373408393\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4495830174374526,\n \"acc_stderr\": 
0.013702290047884738\n }\n}\n```", "repo_url": "https://huggingface.co/abhishekchohan/mistral-7B-med-merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|arc:challenge|25_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|gsm8k|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hellaswag|10_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T23_55_53.444793", "path": ["**/details_harness|winogrande|5_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T23-55-53.444793.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T23_55_53.444793", "path": ["results_2024-01-21T23-55-53.444793.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T23-55-53.444793.parquet"]}]}]} | 2024-01-21T23:58:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-med-merge
Dataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-med-merge on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
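A minimal sketch of that call is given below; the repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming convention rather than stated explicitly in this card, and `harness_winogrande_5` is simply one of the configurations listed in this dataset's metadata.

```python
from datasets import load_dataset

# Load one evaluation config of this run. The repository id follows the
# leaderboard's "details_<org>__<model>" convention (assumed here), and
# "harness_winogrande_5" is one of the configurations declared for this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
    "harness_winogrande_5",
    split="train",
)
```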
## Latest results
These are the latest results from run 2024-01-21T23:55:53.444793 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval).
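The aggregated scores for this run are stored in the "results" configuration of the repository. The snippet below is a minimal sketch for reading them, assuming the same repository id as above and the standard `latest` split:

```python
from datasets import load_dataset

# The "results" config holds the aggregated scores of the run; the
# "latest" split always resolves to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
    "results",
    split="latest",
)
print(results.column_names)  # inspect which result columns are available
print(results[0])            # first row, holding the aggregated metrics
```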
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-med-merge\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-med-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T23:55:53.444793(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-med-merge\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-med-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T23:55:53.444793(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8b69311c27202b58c750ae277c6aeb95b8ed8774 |
# Korean Model Calibration Dataset
Various Korean datasets available on Hugging Face were used. | maywell/ko-calibration | [
"region:us"
] | 2024-01-22T00:43:23+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 55843115, "num_examples": 38772}], "download_size": 31384444, "dataset_size": 55843115}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T02:39:20+00:00 | [] | [] | TAGS
#region-us
|
# Korean Model Calibration Dataset
Various Korean datasets available on Hugging Face were used. | [
"# 한국어 모델 캘리브레이션용 데이터셋\n\n허깅페이스에 올라와 있는 다양한 한국어 데이터셋이 사용되었습니다."
] | [
"TAGS\n#region-us \n",
"# 한국어 모델 캘리브레이션용 데이터셋\n\n허깅페이스에 올라와 있는 다양한 한국어 데이터셋이 사용되었습니다."
] |
3a1c15d2d41d5b1efd95e21f288ccef1ac5a379e |
# Dataset Card for Evaluation run of lodrick-the-lafted/Grafted-Llama2-2x70B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Grafted-Llama2-2x70B](https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T00:41:15.048922](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B/blob/main/results_2024-01-22T00-41-15.048922.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7162400942995992,
"acc_stderr": 0.029857485504267534,
"acc_norm": 0.7199041020126216,
"acc_norm_stderr": 0.030434480280872887,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6649277907845683,
"mc2_stderr": 0.014471158700072567
},
"harness|arc:challenge|25": {
"acc": 0.6928327645051194,
"acc_stderr": 0.013481034054980945,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989505
},
"harness|hellaswag|10": {
"acc": 0.7223660625373431,
"acc_stderr": 0.004469165728600332,
"acc_norm": 0.8957379008165705,
"acc_norm_stderr": 0.003049756910828993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051982,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051982
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.03435568056047875,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.03435568056047875
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7148936170212766,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.7148936170212766,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.02573833063941215,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.02573833063941215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.021090847745939303,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.021090847745939303
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.026653531596715487,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.026653531596715487
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356517,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005472,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.876117496807152,
"acc_stderr": 0.011781017100950739,
"acc_norm": 0.876117496807152,
"acc_norm_stderr": 0.011781017100950739
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.02090397584208303,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.02090397584208303
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6312849162011173,
"acc_stderr": 0.016135759015030126,
"acc_norm": 0.6312849162011173,
"acc_norm_stderr": 0.016135759015030126
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149893,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149893
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5808344198174706,
"acc_stderr": 0.012602244505788219,
"acc_norm": 0.5808344198174706,
"acc_norm_stderr": 0.012602244505788219
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803403,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6649277907845683,
"mc2_stderr": 0.014471158700072567
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873499
},
"harness|gsm8k|5": {
"acc": 0.579226686884003,
"acc_stderr": 0.013598489497182838
}
}
```
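The same figures can also be pulled programmatically; the following is a minimal sketch that assumes the standard `results` configuration and `latest` split of this repository:

```python
from datasets import load_dataset

# Retrieve the aggregated scores shown above; "latest" always points at the
# most recent evaluation run stored in this repository.
results = load_dataset(
    "open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B",
    "results",
    split="latest",
)
print(results.column_names)  # see which result columns the parquet exposes
print(results[0])            # first row, holding the aggregated metrics
```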
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B | [
"region:us"
] | 2024-01-22T00:43:36+00:00 | {"pretty_name": "Evaluation run of lodrick-the-lafted/Grafted-Llama2-2x70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Grafted-Llama2-2x70B](https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T00:41:15.048922](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B/blob/main/results_2024-01-22T00-41-15.048922.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7162400942995992,\n \"acc_stderr\": 0.029857485504267534,\n \"acc_norm\": 0.7199041020126216,\n \"acc_norm_stderr\": 0.030434480280872887,\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6649277907845683,\n \"mc2_stderr\": 0.014471158700072567\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980945,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989505\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7223660625373431,\n \"acc_stderr\": 0.004469165728600332,\n \"acc_norm\": 0.8957379008165705,\n \"acc_norm_stderr\": 0.003049756910828993\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051982,\n \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051982\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 
0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.03435568056047875,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.03435568056047875\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7148936170212766,\n \"acc_stderr\": 0.02951319662553935,\n \"acc_norm\": 0.7148936170212766,\n \"acc_norm_stderr\": 0.02951319662553935\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.02573833063941215,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.02573833063941215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n \"acc_stderr\": 0.021090847745939303,\n \"acc_norm\": 0.8354838709677419,\n \"acc_norm_stderr\": 0.021090847745939303\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 
0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.026653531596715487,\n \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.026653531596715487\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n \"acc_stderr\": 0.025998379092356517,\n \"acc_norm\": 0.8161434977578476,\n \"acc_norm_stderr\": 0.025998379092356517\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005472,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n \"acc_stderr\": 0.011781017100950739,\n \"acc_norm\": 0.876117496807152,\n \"acc_norm_stderr\": 0.011781017100950739\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6312849162011173,\n \"acc_stderr\": 0.016135759015030126,\n \"acc_norm\": 0.6312849162011173,\n \"acc_norm_stderr\": 0.016135759015030126\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149893,\n \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149893\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.02927553215970472,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.02927553215970472\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n \"acc_stderr\": 0.012602244505788219,\n \"acc_norm\": 0.5808344198174706,\n \"acc_norm_stderr\": 0.012602244505788219\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.02540930195322568,\n \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.02540930195322568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6649277907845683,\n \"mc2_stderr\": 0.014471158700072567\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873499\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.579226686884003,\n \"acc_stderr\": 0.013598489497182838\n }\n}\n```", "repo_url": "https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|arc:challenge|25_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|gsm8k|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hellaswag|10_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T00-41-15.048922.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T00-41-15.048922.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T00-41-15.048922.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T00-41-15.048922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T00-41-15.048922.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["**/details_harness|winogrande|5_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T00-41-15.048922.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T00_41_15.048922", "path": ["results_2024-01-22T00-41-15.048922.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T00-41-15.048922.parquet"]}]}]} | 2024-01-22T00:43:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of lodrick-the-lafted/Grafted-Llama2-2x70B
Dataset automatically created during the evaluation run of model lodrick-the-lafted/Grafted-Llama2-2x70B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
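For example, with the `datasets` library (the repository name below follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern and is an assumption here; any configuration listed in this card can be passed as the second argument):

```python
from datasets import load_dataset

# Assumed repository name, following the usual leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B",
    "harness_winogrande_5",  # one of the configurations listed in this card
    split="latest",
)
```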
## Latest results
These are the latest results from run 2024-01-22T00:41:15.048922 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of lodrick-the-lafted/Grafted-Llama2-2x70B\n\n\n\nDataset automatically created during the evaluation run of model lodrick-the-lafted/Grafted-Llama2-2x70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T00:41:15.048922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lodrick-the-lafted/Grafted-Llama2-2x70B\n\n\n\nDataset automatically created during the evaluation run of model lodrick-the-lafted/Grafted-Llama2-2x70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T00:41:15.048922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2c97bcd6713edd075e1d88fa8401ca0b881c9464 |
# PMData Dataset
## About Dataset
Paper: <https://dl.acm.org/doi/10.1145/3339825.3394926>
In this dataset, we present the PMData dataset that aims to combine traditional lifelogging with sports activity logging. Such a dataset enables the development of several interesting analysis applications, e.g., where additional sports data can be used to predict and analyze everyday developments like a person's weight and sleep patterns, and where traditional lifelog data can be used in a sports context to predict an athlete's performance. In this respect, we have used the Fitbit Versa 2 smartwatch wristband, the PMSys sports logging app, and Google Forms for the data collection, and PMData contains logging data for 5 months from 16 persons. Our initial experiments show that such analyses are possible, but there is still large room for improvement.
### Dataset Details
The structure of the main folder:
```text
[Main folder]
├── p01
├── p02
├── ...
├── p16
└── participant-overview.xlsx
```
Each participant's folder (pXX) contains:
- `fitbit` [folder]
  - `calories.json`: Shows how many calories the person has burned in the last minute.
- `distance.json`: Gives the distance moved per minute. Distance seems to be in centimeters.
- `exercise.json`: Describes each activity in more detail. It contains the date with start and stop time, time in different activity levels, type of activity and various performance metrics depending a bit on type of exercise, e.g., for running, it contains distance, time, steps, calories, speed and pace.
- `heart_rate.json`: Shows the number of heart beats per minute (bpm) at a given time.
- `lightly_active_minutes.json`: Sums up the number of lightly active minutes per day.
- `moderately_active_minutes.json`: Sums up the number of moderately active minutes per day.
- `resting_heart_rate.json`: Gives the resting heart rate per day.
- `sedentary_minutes.json`: Sums up the number of sedentary minutes per day.
- `sleep_score.csv`: Helps understand the sleep each night so you can see trends in the sleep patterns. It contains an overall 0-100 score made up from composition, revitalization and duration scores, the number of deep sleep minutes, the resting heart rate and a restlessness score.
  - `sleep.json`: A per-sleep breakdown into periods of light, deep and REM sleep, and time awake.
- `steps.json`: Displays the number of steps per minute.
  - `time_in_heart_rate_zones.json`: Gives the number of minutes in different heart rate zones. Using the common formula of 220 minus your age, Fitbit calculates your maximum heart rate and then creates three target heart rate zones based on that number: fat burn (50 to 69 percent of your max heart rate), cardio (70 to 84 percent of your max heart rate), and peak (85 to 100 percent of your max heart rate).
- `very_active_minutes.json`: Sums up the number of very active minutes per day.
- `googledocs` [folder]
  - `reporting.csv`: Contains one line per report including the date reported for, a timestamp of the report submission time, the eaten meals (breakfast, lunch, dinner and evening meal), the participant's weight that day, the number of glasses drunk, and whether one has consumed alcohol.
- `pmsys` [folder]
- `injury.csv`: Shows injuries with a time and date and corresponding injury locations and a minor and major severity.
  - `srpe.csv`: Contains a training session’s end-time, type of activity, the perceived exertion (RPE), and the duration in minutes. This is, for example, used to calculate the session’s training load or sRPE (RPE×duration), as illustrated in the sketch after this list.
  - `wellness.csv`: Includes parameters like time and date, fatigue, mood, readiness, sleep duration (number of hours), sleep quality, soreness (and soreness area), and stress. Fatigue, sleep quality, soreness, stress, and mood all have a 1-5 scale. The score 3 is normal, 1-2 are scores below normal, and 4-5 are scores above normal. Sleep length is just a measure of how long the sleep was in hours, and readiness (scale 0-10) is an overall subjective measure of how ready you are to exercise, i.e., 0 means not ready at all and 10 indicates that you cannot feel any better and are ready for anything!
- `food-images.zip`: Participants 1, 3 and 5 have taken pictures of everything they have eaten except water during 2 months (February and March). There are food images included in this .zip file, and information about day and time is given in the image header. The participants used their own mobile cameras to collect the images (iPhone 6s, iPhone X and iPhone XS). The standard export function of the macOS Photos software with full quality was used to export the images.
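As a quick illustration of how these files can be read, the sketch below loads one participant's PMSys reports and Fitbit heart rate samples with pandas and computes the session training load (sRPE = RPE × duration). The root path is a placeholder, and the CSV column names (`perceived_exertion`, `duration_min`) are assumptions inferred from the descriptions above rather than guaranteed headers:

```python
import pandas as pd

root = "pmdata"  # placeholder path to the main folder
pid = "p01"      # any participant folder p01 ... p16

# Subjective wellness reports (1-5 scales, readiness on a 0-10 scale).
wellness = pd.read_csv(f"{root}/{pid}/pmsys/wellness.csv")

# Session RPE reports; column names are assumed from the description above.
srpe = pd.read_csv(f"{root}/{pid}/pmsys/srpe.csv")
srpe["training_load"] = srpe["perceived_exertion"] * srpe["duration_min"]  # sRPE

# Per-minute Fitbit heart rate samples exported as JSON.
heart_rate = pd.read_json(f"{root}/{pid}/fitbit/heart_rate.json")

print(wellness.head(), srpe[["training_load"]].describe(), heart_rate.head(), sep="\n")
```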
### Term of use
The license for the PMData dataset is Attribution-NonCommercial 4.0 International. More information can be found here: <https://creativecommons.org/licenses/by-nc/4.0/legalcode>
### Citation
```bibtex
@inproceedings{10.1145/3339825.3394926,
address = {New York, NY, USA},
author = {Thambawita, Vajira and Hicks, Steven Alexander and Borgli, Hanna and Stensland, H\r{a}kon Kvale and Jha, Debesh and Svensen, Martin Kristoffer and Pettersen, Svein-Arne and Johansen, Dag and Johansen, H\r{a}vard Dagenborg and Pettersen, Susann Dahl and Nordvang, Simon and Pedersen, Sigurd and Gjerdrum, Anders and Gr\o{}nli, Tor-Morten and Fredriksen, Per Morten and Eg, Ragnhild and Hansen, Kjeld and Fagernes, Siri and Claudi, Christine and Bi\o{}rn-Hansen, Andreas and Nguyen, Duc Tien Dang and Kupka, Tomas and Hammer, Hugo Lewi and Jain, Ramesh and Riegler, Michael Alexander and Halvorsen, P\r{a}l},
booktitle = {Proceedings of the 11th ACM Multimedia Systems Conference},
doi = {10.1145/3339825.3394926},
isbn = {9781450368452},
keywords = {sports logging, questionnaires, food pictures, neural networks, multimedia dataset, sensor data, machine learning},
location = {Istanbul, Turkey},
numpages = {6},
pages = {231-236},
publisher = {Association for Computing Machinery},
series = {MMSys '20},
title = {PMData: A Sports Logging Dataset},
url = {https://doi.org/10.1145/3339825.3394926},
year = {2020},
}
```
| aai530-group6/pmdata | [
"language:en",
"license:cc-by-4.0",
"health",
"region:us"
] | 2024-01-22T00:51:14+00:00 | {"language": ["en"], "license": "cc-by-4.0", "pretty_name": "pmdata", "tags": ["health"]} | 2024-01-22T03:55:50+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-4.0 #health #region-us
|
# PMData Dataset
## About Dataset
Paper: <URL
In this dataset, we present the PMData dataset that aims to combine traditional lifelogging with sports activity logging. Such a dataset enables the development of several interesting analysis applications, e.g., where additional sports data can be used to predict and analyze everyday developments like a person's weight and sleep patterns, and where traditional lifelog data can be used in a sports context to predict an athletes performance. In this respect, we have used the Fitbit Versa 2 smartwatch wristband, the PMSys sports logging app a and Google forms for the data collection, and PMData contains logging data for 5 months from 16 persons. Our initial experiments show that such analyzes are possible, but there are still large rooms for improvements.
### Dataset Details
The structure of the main folder:
Each participant's folder (pXX) contains:
- 'fitbit' [folder]
- 'URL': Shows how many calories the person have burned the last minute.
- 'URL': Gives the distance moved per minute. Distance seems to be in centimeters.
- 'URL': Describes each activity in more detail. It contains the date with start and stop time, time in different activity levels, type of activity and various performance metrics depending a bit on type of exercise, e.g., for running, it contains distance, time, steps, calories, speed and pace.
- 'heart_rate.json': Shows the number of heart beats per minute (bpm) at a given time.
- 'lightly_active_minutes.json': Sums up the number of lightly active minutes per day.
- 'moderately_active_minutes.json': Sums up the number of moderately active minutes per day.
- 'resting_heart_rate.json': Gives the resting heart rate per day.
- 'sedentary_minutes.json': Sums up the number of sedentary minutes per day.
- 'sleep_score.csv': Helps understand the sleep each night so you can see trends in the sleep patterns. It contains an overall 0-100 score made up from composition, revitalization and duration scores, the number of deep sleep minutes, the resting heart rate and a restlessness score.
- 'URL': A per sleep breakdown of the sleep into periods of light, deep, rem sleeps and time awake.
- 'URL': Displays the number of steps per minute.
- 'time_in_heart_rate_zones.json': Gives the number of minutes in different heart rate zones. Using the common formula of 220 minus your age, Fitbit will calculate your maximum heart rate and then create three target heart rate zones fat burn (50 to 69 percent of your max heart rate), cardio (70 to 84 percent of your max heart rate), and peak (85 to 100 percent of your max heart rate) - based off that number.
- 'very_active_minutes.json': Sums up the number of very active minutes per day.
- 'googledocs' [folder]
- 'URL': Contains one line per report including the date reported for, a timestamp of the report submission time, the eaten meals (breakfast, lunch, dinner and evening meal), the participants weight this day, the number of glasses drunk, and whether one has consumed alcohol.
- 'pmsys' [folder]
- 'URL': Shows injuries with a time and date and corresponding injury locations and a minor and major severity.
- 'URL': Contains a training session’s end-time, type of activity, the perceived exertion (RPE), and the duration in the number of minutes. This is, for example, used to calculate the sessions training load or sRPE (RPE×duration).
- 'URL': Includes parameters like time and date, fatigue, mood, readiness, sleep duration (number of hours), sleep quality, soreness (and soreness area), and stress. Fatigue, sleep qual-ity, soreness, stress, and mood all have a 1-5 scale. The score 3 is normal, and 1-2 are scores below normal and 4-5 are scores above normal. Sleep length is just a measure of how long the sleep was in hours, and readiness (scale 0-10) is an overall subjective measure of how ready are you to exercise, i.e., 0 means not ready at all and 10 indicates that you cannot feel any better and are ready for anything!
- 'URL': Participants 1, 3 and 5 have taken pictures of everything they have eaten except water during 2 months (February and March). There are food images included in this .zip file, and information about day and time is given in the image header. The participants used their own mobile cameras to collect the images (Iphone 6s, Iphone X and Iphone XS). The standard export function of the MacOS Photos software with full quality was used to export the images.
### Term of use
The license for the PMData dataset is Attribution-NonCommercial 4.0 International. More information can be found here: <URL
| [
"# PMData Dataset",
"## About Dataset\n\nPaper: <URL\n\nIn this dataset, we present the PMData dataset that aims to combine traditional lifelogging with sports activity logging. Such a dataset enables the development of several interesting analysis applications, e.g., where additional sports data can be used to predict and analyze everyday developments like a person's weight and sleep patterns, and where traditional lifelog data can be used in a sports context to predict an athletes performance. In this respect, we have used the Fitbit Versa 2 smartwatch wristband, the PMSys sports logging app a and Google forms for the data collection, and PMData contains logging data for 5 months from 16 persons. Our initial experiments show that such analyzes are possible, but there are still large rooms for improvements.",
"### Dataset Details\n\nThe structure of the main folder:\n\n\n\nEach participant's folder (pXX) contains:\n\n- 'fitbit' [folder]\n - 'URL': Shows how many calories the person have burned the last minute.\n - 'URL': Gives the distance moved per minute. Distance seems to be in centimeters.\n - 'URL': Describes each activity in more detail. It contains the date with start and stop time, time in different activity levels, type of activity and various performance metrics depending a bit on type of exercise, e.g., for running, it contains distance, time, steps, calories, speed and pace.\n - 'heart_rate.json': Shows the number of heart beats per minute (bpm) at a given time.\n - 'lightly_active_minutes.json': Sums up the number of lightly active minutes per day.\n - 'moderately_active_minutes.json': Sums up the number of moderately active minutes per day.\n - 'resting_heart_rate.json': Gives the resting heart rate per day.\n - 'sedentary_minutes.json': Sums up the number of sedentary minutes per day.\n - 'sleep_score.csv': Helps understand the sleep each night so you can see trends in the sleep patterns. It contains an overall 0-100 score made up from composition, revitalization and duration scores, the number of deep sleep minutes, the resting heart rate and a restlessness score.\n - 'URL': A per sleep breakdown of the sleep into periods of light, deep, rem sleeps and time awake.\n - 'URL': Displays the number of steps per minute.\n - 'time_in_heart_rate_zones.json': Gives the number of minutes in different heart rate zones. Using the common formula of 220 minus your age, Fitbit will calculate your maximum heart rate and then create three target heart rate zones fat burn (50 to 69 percent of your max heart rate), cardio (70 to 84 percent of your max heart rate), and peak (85 to 100 percent of your max heart rate) - based off that number.\n - 'very_active_minutes.json': Sums up the number of very active minutes per day.\n\n- 'googledocs' [folder]\n - 'URL': Contains one line per report including the date reported for, a timestamp of the report submission time, the eaten meals (breakfast, lunch, dinner and evening meal), the participants weight this day, the number of glasses drunk, and whether one has consumed alcohol.\n\n- 'pmsys' [folder]\n - 'URL': Shows injuries with a time and date and corresponding injury locations and a minor and major severity.\n - 'URL': Contains a training session’s end-time, type of activity, the perceived exertion (RPE), and the duration in the number of minutes. This is, for example, used to calculate the sessions training load or sRPE (RPE×duration).\n - 'URL': Includes parameters like time and date, fatigue, mood, readiness, sleep duration (number of hours), sleep quality, soreness (and soreness area), and stress. Fatigue, sleep qual-ity, soreness, stress, and mood all have a 1-5 scale. The score 3 is normal, and 1-2 are scores below normal and 4-5 are scores above normal. Sleep length is just a measure of how long the sleep was in hours, and readiness (scale 0-10) is an overall subjective measure of how ready are you to exercise, i.e., 0 means not ready at all and 10 indicates that you cannot feel any better and are ready for anything!\n\n- 'URL': Participants 1, 3 and 5 have taken pictures of everything they have eaten except water during 2 months (February and March). There are food images included in this .zip file, and information about day and time is given in the image header. 
The participants used their own mobile cameras to collect the images (Iphone 6s, Iphone X and Iphone XS). The standard export function of the MacOS Photos software with full quality was used to export the images.",
"### Term of use\n\nThe license for the PMData dataset is Attribution-NonCommercial 4.0 International. More information can be found here: <URL"
] | [
"TAGS\n#language-English #license-cc-by-4.0 #health #region-us \n",
"# PMData Dataset",
"## About Dataset\n\nPaper: <URL\n\nIn this dataset, we present the PMData dataset that aims to combine traditional lifelogging with sports activity logging. Such a dataset enables the development of several interesting analysis applications, e.g., where additional sports data can be used to predict and analyze everyday developments like a person's weight and sleep patterns, and where traditional lifelog data can be used in a sports context to predict an athletes performance. In this respect, we have used the Fitbit Versa 2 smartwatch wristband, the PMSys sports logging app a and Google forms for the data collection, and PMData contains logging data for 5 months from 16 persons. Our initial experiments show that such analyzes are possible, but there are still large rooms for improvements.",
"### Dataset Details\n\nThe structure of the main folder:\n\n\n\nEach participant's folder (pXX) contains:\n\n- 'fitbit' [folder]\n - 'URL': Shows how many calories the person have burned the last minute.\n - 'URL': Gives the distance moved per minute. Distance seems to be in centimeters.\n - 'URL': Describes each activity in more detail. It contains the date with start and stop time, time in different activity levels, type of activity and various performance metrics depending a bit on type of exercise, e.g., for running, it contains distance, time, steps, calories, speed and pace.\n - 'heart_rate.json': Shows the number of heart beats per minute (bpm) at a given time.\n - 'lightly_active_minutes.json': Sums up the number of lightly active minutes per day.\n - 'moderately_active_minutes.json': Sums up the number of moderately active minutes per day.\n - 'resting_heart_rate.json': Gives the resting heart rate per day.\n - 'sedentary_minutes.json': Sums up the number of sedentary minutes per day.\n - 'sleep_score.csv': Helps understand the sleep each night so you can see trends in the sleep patterns. It contains an overall 0-100 score made up from composition, revitalization and duration scores, the number of deep sleep minutes, the resting heart rate and a restlessness score.\n - 'URL': A per sleep breakdown of the sleep into periods of light, deep, rem sleeps and time awake.\n - 'URL': Displays the number of steps per minute.\n - 'time_in_heart_rate_zones.json': Gives the number of minutes in different heart rate zones. Using the common formula of 220 minus your age, Fitbit will calculate your maximum heart rate and then create three target heart rate zones fat burn (50 to 69 percent of your max heart rate), cardio (70 to 84 percent of your max heart rate), and peak (85 to 100 percent of your max heart rate) - based off that number.\n - 'very_active_minutes.json': Sums up the number of very active minutes per day.\n\n- 'googledocs' [folder]\n - 'URL': Contains one line per report including the date reported for, a timestamp of the report submission time, the eaten meals (breakfast, lunch, dinner and evening meal), the participants weight this day, the number of glasses drunk, and whether one has consumed alcohol.\n\n- 'pmsys' [folder]\n - 'URL': Shows injuries with a time and date and corresponding injury locations and a minor and major severity.\n - 'URL': Contains a training session’s end-time, type of activity, the perceived exertion (RPE), and the duration in the number of minutes. This is, for example, used to calculate the sessions training load or sRPE (RPE×duration).\n - 'URL': Includes parameters like time and date, fatigue, mood, readiness, sleep duration (number of hours), sleep quality, soreness (and soreness area), and stress. Fatigue, sleep qual-ity, soreness, stress, and mood all have a 1-5 scale. The score 3 is normal, and 1-2 are scores below normal and 4-5 are scores above normal. Sleep length is just a measure of how long the sleep was in hours, and readiness (scale 0-10) is an overall subjective measure of how ready are you to exercise, i.e., 0 means not ready at all and 10 indicates that you cannot feel any better and are ready for anything!\n\n- 'URL': Participants 1, 3 and 5 have taken pictures of everything they have eaten except water during 2 months (February and March). There are food images included in this .zip file, and information about day and time is given in the image header. 
The participants used their own mobile cameras to collect the images (Iphone 6s, Iphone X and Iphone XS). The standard export function of the MacOS Photos software with full quality was used to export the images.",
"### Term of use\n\nThe license for the PMData dataset is Attribution-NonCommercial 4.0 International. More information can be found here: <URL"
] |
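As an illustrative sketch of working with the PMData folder layout described above, the snippet below reads two of the explicitly named Fitbit files for one participant with pandas. The root path `pmdata/p01` is a placeholder assumption, and the JSON flattening may need adjusting to the file's exact structure.

```python
import json
from pathlib import Path

import pandas as pd

# Placeholder path to one participant's folder (pXX) after unpacking PMData.
participant = Path("pmdata/p01")

# Per-night sleep scores: overall 0-100 score, deep-sleep minutes, resting heart rate, restlessness.
sleep_scores = pd.read_csv(participant / "fitbit" / "sleep_score.csv")

# Per-minute heart rate samples exported by Fitbit as JSON.
with open(participant / "fitbit" / "heart_rate.json") as f:
    heart_rate = pd.json_normalize(json.load(f))

print(sleep_scores.head())
print(heart_rate.head())
```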
870e6dc75de4ae993b10e23fcc69837f94386cc6 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | LeeDaeSeong/HPMP | [
"region:us"
] | 2024-01-22T00:54:11+00:00 | {} | 2024-01-22T04:47:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1f8165f7dd18b232a77c0427f9f1fa51378e2bdc | # Dataset Card for CHOCOLATE
- [Dataset Description](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#dataset-description)
- [Paper Information](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#paper-information)
- [Citation](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#citation)
## Dataset Description
**CHOCOLATE** is a benchmark for detecting and correcting factual inconsistency in generated chart captions. It consists of captions produced by six of the most advanced models, which are categorized into three subsets:
- **LVLM**: GPT-4V, Bard (before Gemini)
- **LLM-based Pipeline**: DePlot + GPT-4
- **Fine-tuned Model**: ChartT5, MatCha, UniChart
The charts are from two datasets: VisText and the Pew split of Chart-to-Text. In total, **CHOCOLATE** consists of **1,187 examples**. Each instance in **CHOCOLATE** consists of a caption generated by one of the models and the annotations of the factual errors for each caption sentence.
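A minimal sketch of loading the benchmark with the `datasets` library is shown below; the repository id and the single `test` split follow this card's configuration, while the printed fields are left generic because the exact per-instance schema is not spelled out here.

```python
from datasets import load_dataset

# The benchmark ships a single "test" split (see this card's configuration).
chocolate = load_dataset("khhuang/CHOCOLATE", split="test")

print(len(chocolate))       # expected: 1,187 annotated captions
print(chocolate[0].keys())  # inspect the per-instance fields before use
```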
## Paper Information
- Paper: https://arxiv.org/abs/2312.10160
- Code: https://github.com/khuangaf/CHOCOLATE/
- Project: https://khuangaf.github.io/CHOCOLATE
## Citation
If you use the **CHOCOLATE** dataset in your work, please kindly cite the paper using this BibTeX:
```
@misc{huang-etal-2023-do,
title = "Do LVLMs Understand Charts? Analyzing and Correcting Factual Errors in Chart Captioning",
author = "Huang, Kung-Hsiang and
Zhou, Mingyang and
Chan, Hou Pong and
Fung, Yi R. and
Wang, Zhenhailong and
Zhang, Lingyu and
Chang, Shih-Fu and
Ji, Heng",
year={2023},
eprint={2312.10160},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | khhuang/CHOCOLATE | [
"annotations_creators:expert-generated",
"annotations_creators:found",
"language_creators:expert-generated",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"chart",
"plot",
"chart-to-text",
"vistext",
"statista",
"pew",
"chart-understanding",
"chart-captioning",
"chart-summarization",
"document-image",
"arxiv:2312.10160",
"region:us"
] | 2024-01-22T01:27:40+00:00 | {"annotations_creators": ["expert-generated", "found"], "language_creators": ["expert-generated", "found"], "language": ["en"], "license": "apache-2.0", "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "paperswithcode_id": "chocolate", "pretty_name": "CHOCOLATE", "tags": ["chart", "plot", "chart-to-text", "vistext", "statista", "pew", "chart-understanding", "chart-captioning", "chart-summarization", "document-image"], "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "chocolate.json"}]}]} | 2024-01-22T06:04:42+00:00 | [
"2312.10160"
] | [
"en"
] | TAGS
#annotations_creators-expert-generated #annotations_creators-found #language_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #language-English #license-apache-2.0 #chart #plot #chart-to-text #vistext #statista #pew #chart-understanding #chart-captioning #chart-summarization #document-image #arxiv-2312.10160 #region-us
| # Dataset Card for CHOCOLATE
- Dataset Description
- Paper Information
- Citation
## Dataset Description
CHOCOLATE is a benchmark for detecting and correcting factual inconsistency in generated chart captions. It consists of captions produced by six of the most advanced models, which are categorized into three subsets:
- LVLM: GPT-4V, Bard (before Gemini)
- LLM-based Pipeline: DePlot + GPT-4
- Fine-tuned Model: ChartT5, MatCha, UniChart
The charts are from two datasets: VisText and the Pew split of Chart-to-Text. In total, CHOCOLATE consists of 1,187 examples. Each instance in CHOCOLATE consists of a caption generated by one of the models and the annotations of the factual errors for each caption sentence.
## Paper Information
- Paper: URL
- Code: URL
- Project: URL
If you use the CHOCOLATE dataset in your work, please kindly cite the paper using this BibTeX:
| [
"# Dataset Card for CHOCOLATE\n\n- Dataset Description\n- Paper Information\n- Citation",
"## Dataset Description\n\nCHOCOLATE is a benchmark for detecting and correcting factual inconsistency in generated chart captions. It consists of captions produced by six most advanced models, which are categorized into three subsets:\n\n- LVLM: GPT-4V, Bard (before Gemini)\n- LLM-based Pipeline: DePlot + GPT-4\n- Fine-tuned Model: ChartT5, MatCha, UniChart\n\n\nThe charts are from two datasets: VisText and the Pew split of Chart-to-Text. In total, CHOCOLATE consists of 1,187 examples. Each instance in CHOCOLATE consists of a caption generated by one of the model and the annotations of the factual errors for each caption sentence.",
"## Paper Information\n\n- Paper: URL\n- Code: URL\n- Project: URL\n\n\nIf you use the CHOCOLATE dataset in your work, please kindly cite the paper using this BibTeX:"
] | [
"TAGS\n#annotations_creators-expert-generated #annotations_creators-found #language_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #language-English #license-apache-2.0 #chart #plot #chart-to-text #vistext #statista #pew #chart-understanding #chart-captioning #chart-summarization #document-image #arxiv-2312.10160 #region-us \n",
"# Dataset Card for CHOCOLATE\n\n- Dataset Description\n- Paper Information\n- Citation",
"## Dataset Description\n\nCHOCOLATE is a benchmark for detecting and correcting factual inconsistency in generated chart captions. It consists of captions produced by six most advanced models, which are categorized into three subsets:\n\n- LVLM: GPT-4V, Bard (before Gemini)\n- LLM-based Pipeline: DePlot + GPT-4\n- Fine-tuned Model: ChartT5, MatCha, UniChart\n\n\nThe charts are from two datasets: VisText and the Pew split of Chart-to-Text. In total, CHOCOLATE consists of 1,187 examples. Each instance in CHOCOLATE consists of a caption generated by one of the model and the annotations of the factual errors for each caption sentence.",
"## Paper Information\n\n- Paper: URL\n- Code: URL\n- Project: URL\n\n\nIf you use the CHOCOLATE dataset in your work, please kindly cite the paper using this BibTeX:"
] |
a88419d84aa7192c680bb2116356e4b0ea135ee0 | I modified https://huggingface.co/datasets/IlyaGusev/pippa_scored so that it can be used as training data.
http://openerotica.etsy.com/
https://www.patreon.com/openerotica | openerotica/pippa_scored2sharegpt | [
"license:apache-2.0",
"region:us"
] | 2024-01-22T01:31:35+00:00 | {"license": "apache-2.0"} | 2024-01-22T01:44:19+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| I modified URL so that it can be used as training data.
URL
URL | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
063058fe9d1c385b10ec9d0febf7a10c3cf1ecda |
# Pandora RLHF
A Reinforcement Learning from Human Feedback (RLHF) dataset for Direct Preference Optimization (DPO) fine-tuning of the Pandora Large Language Model (LLM).
The dataset is based on the [anthropic/hh-rlhf](https://huggingface.co/datasets/anthropic/hh-rlhf) dataset.
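As a rough sketch, the preference data can be pulled down with the `datasets` library; the repository id is taken from this card, while the exact splits and column names are not documented here and should be checked after loading.

```python
from datasets import load_dataset

# Load the preference data intended for DPO fine-tuning.
pandora_rlhf = load_dataset("danilopeixoto/pandora-rlhf")

# Inspect splits and columns before wiring the data into a DPO trainer,
# since the schema is not spelled out on this card.
print(pandora_rlhf)
```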
## Copyright and license
Copyright (c) 2024, Danilo Peixoto Ferreira. All rights reserved.
Project developed under a [BSD-3-Clause license](LICENSE.md).
| danilopeixoto/pandora-rlhf | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"license:bsd-3-clause",
"dpo",
"fine-tuning",
"rlhf",
"region:us"
] | 2024-01-22T01:32:36+00:00 | {"license": "bsd-3-clause", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "Pandora RLHF", "tags": ["dpo", "fine-tuning", "rlhf"]} | 2024-01-22T01:35:26+00:00 | [] | [] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #license-bsd-3-clause #dpo #fine-tuning #rlhf #region-us
|
# Pandora RLHF
A Reinforcement Learning from Human Feedback (RLHF) dataset for Direct Preference Optimization (DPO) fine-tuning of the Pandora Large Language Model (LLM).
The dataset is based on the anthropic/hh-rlhf dataset.
## Copyright and license
Copyright (c) 2024, Danilo Peixoto Ferreira. All rights reserved.
Project developed under a BSD-3-Clause license.
| [
"# Pandora RLHF\n\nA Reinforcement Learning from Human Feedback (RLHF) dataset for Direct Preference Optimization (DPO) fine-tuning of the Pandora Large Language Model (LLM).\n\nThe dataset is based on the anthropic/hh-rlhf dataset.",
"## Copyright and license\n\nCopyright (c) 2024, Danilo Peixoto Ferreira. All rights reserved.\n\nProject developed under a BSD-3-Clause license."
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #license-bsd-3-clause #dpo #fine-tuning #rlhf #region-us \n",
"# Pandora RLHF\n\nA Reinforcement Learning from Human Feedback (RLHF) dataset for Direct Preference Optimization (DPO) fine-tuning of the Pandora Large Language Model (LLM).\n\nThe dataset is based on the anthropic/hh-rlhf dataset.",
"## Copyright and license\n\nCopyright (c) 2024, Danilo Peixoto Ferreira. All rights reserved.\n\nProject developed under a BSD-3-Clause license."
] |
097d3ed2baeb2cbc02ca73f0fd9d2c9956a5e560 |
# Dataset Card for Evaluation run of Vasanth/Beast-Soul
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Vasanth/Beast-Soul](https://huggingface.co/Vasanth/Beast-Soul) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Vasanth__Beast-Soul",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T01:37:25.115466](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul/blob/main/results_2024-01-22T01-37-25.115466.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6534978697420841,
"acc_stderr": 0.0320972689830544,
"acc_norm": 0.6529230512023926,
"acc_norm_stderr": 0.032767010735709305,
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6675667113390573,
"mc2_stderr": 0.015196423862548429
},
"harness|arc:challenge|25": {
"acc": 0.6979522184300341,
"acc_stderr": 0.013417519144716417,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7096195976897033,
"acc_stderr": 0.0045301018699731915,
"acc_norm": 0.8814977096195977,
"acc_norm_stderr": 0.003225414119289712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050876,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050876
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6675667113390573,
"mc2_stderr": 0.015196423862548429
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370623
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470152
}
}
```
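A small sketch of reading these aggregated numbers back out of a downloaded copy of the results file is shown below; the local file name is an assumption, and the snippet relies only on the structure shown in the block above (an `"all"` entry alongside the per-task entries).

```python
import json

# Assumed local copy of the results file linked above.
with open("results_2024-01-22T01-37-25.115466.json") as f:
    results = json.load(f)

# "all" holds the aggregated metrics shown at the top of the block above.
print(results["all"]["acc"], results["all"]["acc_norm"])
```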
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Vasanth__Beast-Soul | [
"region:us"
] | 2024-01-22T01:39:43+00:00 | {"pretty_name": "Evaluation run of Vasanth/Beast-Soul", "dataset_summary": "Dataset automatically created during the evaluation run of model [Vasanth/Beast-Soul](https://huggingface.co/Vasanth/Beast-Soul) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Vasanth__Beast-Soul\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T01:37:25.115466](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul/blob/main/results_2024-01-22T01-37-25.115466.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534978697420841,\n \"acc_stderr\": 0.0320972689830544,\n \"acc_norm\": 0.6529230512023926,\n \"acc_norm_stderr\": 0.032767010735709305,\n \"mc1\": 0.5140758873929009,\n \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6675667113390573,\n \"mc2_stderr\": 0.015196423862548429\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7096195976897033,\n \"acc_stderr\": 0.0045301018699731915,\n \"acc_norm\": 0.8814977096195977,\n \"acc_norm_stderr\": 0.003225414119289712\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050876,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050876\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5140758873929009,\n \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6675667113390573,\n \"mc2_stderr\": 0.015196423862548429\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370623\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \"acc_stderr\": 0.012551285331470152\n }\n}\n```", 
"repo_url": "https://huggingface.co/Vasanth/Beast-Soul", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|arc:challenge|25_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|gsm8k|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hellaswag|10_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T01-37-25.115466.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T01-37-25.115466.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T01-37-25.115466.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T01-37-25.115466.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T01-37-25.115466.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T01_37_25.115466", "path": ["**/details_harness|winogrande|5_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T01-37-25.115466.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T01_37_25.115466", "path": ["results_2024-01-22T01-37-25.115466.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T01-37-25.115466.parquet"]}]}]} | 2024-01-22T01:40:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Vasanth/Beast-Soul
Dataset automatically created during the evaluation run of model Vasanth/Beast-Soul on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
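A minimal sketch of that call (hedged: the dataset id below is assumed from the usual `open-llm-leaderboard/details_<org>__<model>` naming convention and is not stated in this card; any config name listed in the metadata above, such as `harness_winogrande_5`, should work):

```python
from datasets import load_dataset

# Assumed dataset id: follows the usual open-llm-leaderboard
# "details_<org>__<model>" naming convention for this model.
data = load_dataset(
    "open-llm-leaderboard/details_Vasanth__Beast-Soul",
    "harness_winogrande_5",  # any of the configs listed above
    split="train",
)
print(data)
```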
## Latest results
These are the latest results from run 2024-01-22T01:37:25.115466 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Vasanth/Beast-Soul\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Beast-Soul on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T01:37:25.115466(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Vasanth/Beast-Soul\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Beast-Soul on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T01:37:25.115466(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
95252d625728d3592265903306ccf4d49a4c4e83 | ---
language:
- en
license: odbl
tags:
- health
- heart-disease
- medical
- machine-learning
annotations_creators:
- expert-generated
language_creators:
- expert-generated
pretty_name: Heart Failure Prediction Dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- structured-data-classification
task_ids:
- binary-classification
- health-data-analysis
paperswithcode_id: heart-failure-prediction
configs:
- default
dataset_info:
features:
- name: Age
dtype: int32
- name: Sex
dtype: string
- name: ChestPainType
dtype: string
- name: RestingBP
dtype: int32
- name: Cholesterol
dtype: int32
- name: FastingBS
dtype: int32
- name: RestingECG
dtype: string
- name: MaxHR
dtype: int32
- name: ExerciseAngina
dtype: string
- name: Oldpeak
dtype: float32
- name: ST_Slope
dtype: string
- name: HeartDisease
dtype: int32
config_name: default
splits:
- name: total
num_bytes: UNKNOWN
num_examples: 918
download_size: UNKNOWN
dataset_size: UNKNOWN
train-eval-index:
- config: default
task: structured-data-classification
task_id: binary-classification
splits:
train_split: train
eval_split: validation
col_mapping:
Age: Age
Sex: Sex
ChestPainType: ChestPainType
RestingBP: RestingBP
Cholesterol: Cholesterol
FastingBS: FastingBS
RestingECG: RestingECG
MaxHR: MaxHR
ExerciseAngina: ExerciseAngina
Oldpeak: Oldpeak
ST_Slope: ST_Slope
HeartDisease: HeartDisease
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 Score
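As an illustration of how the schema above might be consumed (a hedged sketch: the dataset id is taken from this card's repository name, the split name `total` and the column names come from the YAML, and the one-hot-encoding-plus-logistic-regression baseline is only an example, not part of the dataset):

```python
from datasets import load_dataset
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import pandas as pd

# Split name ("total") and column names are assumed from the YAML above;
# the repository may expose a different split name in practice.
df = load_dataset("aai530-group6/heart-failure-prediction-dataset", split="total").to_pandas()

X = pd.get_dummies(df.drop(columns=["HeartDisease"]))  # one-hot encode the string-typed columns
y = df["HeartDisease"]                                  # binary target declared in the YAML

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```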
| aai530-group6/heart-failure-prediction-dataset | [
"region:us"
] | 2024-01-22T01:46:08+00:00 | {} | 2024-01-22T02:16:19+00:00 | [] | [] | TAGS
#region-us
| ---
language:
- en
license: odbl
tags:
- health
- heart-disease
- medical
- machine-learning
annotations_creators:
- expert-generated
language_creators:
- expert-generated
pretty_name: Heart Failure Prediction Dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- structured-data-classification
task_ids:
- binary-classification
- health-data-analysis
paperswithcode_id: heart-failure-prediction
configs:
- default
dataset_info:
features:
- name: Age
dtype: int32
- name: Sex
dtype: string
- name: ChestPainType
dtype: string
- name: RestingBP
dtype: int32
- name: Cholesterol
dtype: int32
- name: FastingBS
dtype: int32
- name: RestingECG
dtype: string
- name: MaxHR
dtype: int32
- name: ExerciseAngina
dtype: string
- name: Oldpeak
dtype: float32
- name: ST_Slope
dtype: string
- name: HeartDisease
dtype: int32
config_name: default
splits:
- name: total
num_bytes: UNKNOWN
num_examples: 918
download_size: UNKNOWN
dataset_size: UNKNOWN
train-eval-index:
- config: default
task: structured-data-classification
task_id: binary-classification
splits:
train_split: train
eval_split: validation
col_mapping:
Age: Age
Sex: Sex
ChestPainType: ChestPainType
RestingBP: RestingBP
Cholesterol: Cholesterol
FastingBS: FastingBS
RestingECG: RestingECG
MaxHR: MaxHR
ExerciseAngina: ExerciseAngina
Oldpeak: Oldpeak
ST_Slope: ST_Slope
HeartDisease: HeartDisease
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 Score
| [] | [
"TAGS\n#region-us \n"
] |
bc5530c11e28e3a879ae75413fc8b237ac3dc3e0 |
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r128
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r128](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128",
"harness_winogrande_5",
split="train")
```
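As a quick follow-up (an illustrative sketch, not part of the original card), the loaded split can be inspected and converted to a pandas DataFrame for ad-hoc analysis:

```python
# Inspect the split loaded above and convert it to pandas for analysis.
print(data)             # features and number of rows in the "train" split
df = data.to_pandas()   # datasets.Dataset -> pandas.DataFrame (requires pandas)
print(df.head())
```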
## Latest results
These are the [latest results from run 2024-01-22T02:02:49.417178](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128/blob/main/results_2024-01-22T02-02-49.417178.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473729518927601,
"acc_stderr": 0.03165407821154855,
"acc_norm": 0.6587026634697054,
"acc_norm_stderr": 0.032520024648863555,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.5250446633100931,
"mc2_stderr": 0.015325921720538403
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909869
},
"harness|hellaswag|10": {
"acc": 0.6256721768571998,
"acc_stderr": 0.004829598101635788,
"acc_norm": 0.8174666401115316,
"acc_norm_stderr": 0.0038549403270910537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267822,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548302,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.02777253333421896,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.02777253333421896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971114,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.01428337804429641,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.01428337804429641
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472657,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.5250446633100931,
"mc2_stderr": 0.015325921720538403
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479648
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492658
}
}
```
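To read the aggregated numbers above programmatically, one option is to load the "results" config directly (a sketch; it assumes the `results` config and its `latest` split declared in this repo's configuration, and that the column layout mirrors the JSON shown above):

```python
from datasets import load_dataset

# Aggregated per-task metrics for this model; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128",
    "results",
    split="latest",
)
print(results[0])  # e.g. acc / acc_norm / mc2 fields, as in the JSON above
```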
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128 | [
"region:us"
] | 2024-01-22T02:04:52+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Stellaris-internlm2-20b-r128", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r128](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:02:49.417178](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128/blob/main/results_2024-01-22T02-02-49.417178.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473729518927601,\n \"acc_stderr\": 0.03165407821154855,\n \"acc_norm\": 0.6587026634697054,\n \"acc_norm_stderr\": 0.032520024648863555,\n \"mc1\": 0.3427172582619339,\n \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.5250446633100931,\n \"mc2_stderr\": 0.015325921720538403\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909869\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6256721768571998,\n \"acc_stderr\": 0.004829598101635788,\n \"acc_norm\": 0.8174666401115316,\n \"acc_norm_stderr\": 0.0038549403270910537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.03353647469713839\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267822,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5812807881773399,\n \"acc_stderr\": 0.034711928605184676,\n \"acc_norm\": 0.5812807881773399,\n \"acc_norm_stderr\": 0.034711928605184676\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548302,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.02777253333421896,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.02777253333421896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 
0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971114,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971114\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n \"acc_stderr\": 0.01428337804429641,\n 
\"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.01428337804429641\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n \"acc_stderr\": 0.012761104871472657,\n \"acc_norm\": 0.4810951760104302,\n \"acc_norm_stderr\": 0.012761104871472657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.5250446633100931,\n \"mc2_stderr\": 0.015325921720538403\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479648\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492658\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r128", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-02-49.417178.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-02-49.417178.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-02-49.417178.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-02-49.417178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-02-49.417178.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_02_49.417178", "path": ["**/details_harness|winogrande|5_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T02-02-49.417178.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T02_02_49.417178", "path": ["results_2024-01-22T02-02-49.417178.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T02-02-49.417178.parquet"]}]}]} | 2024-01-22T02:05:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r128
Dataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r128 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
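A minimal sketch of that load call, using the repository name and one of the task configurations declared in this card's metadata (here the 5-shot Winogrande details):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task; the "train" split points to the latest run
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128",
    "harness_winogrande_5",
    split="train",
)
```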
## Latest results
These are the latest results from run 2024-01-22T02:02:49.417178 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
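The aggregated numbers can also be pulled down programmatically; a minimal sketch, assuming only the `datasets` library and the "results" configuration declared in this card's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r128",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / stderr fields for this run
```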
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r128\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r128 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:02:49.417178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r128\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r128 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:02:49.417178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2a3f63eb685b9948add564da8d4938534bfac1af |
# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T02:13:58.257879](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31/blob/main/results_2024-01-22T02-13-58.257879.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"acc": 0.5330985732271527,
"acc_stderr": 0.034185007803077,
"acc_norm": 0.5352323665963996,
"acc_norm_stderr": 0.034920748737001794,
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5134609475665187,
"mc2_stderr": 0.014908191115467387
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693028
},
"harness|hellaswag|10": {
"acc": 0.6441943835889266,
"acc_stderr": 0.004777782584817781,
"acc_norm": 0.8419637522405895,
"acc_norm_stderr": 0.003640294912838683
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327824,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829114,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.0484674825397724,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.0484674825397724
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.01520103251252044,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.01520103251252044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580955,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580955
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.01251018163696068,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.01251018163696068
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5134609475665187,
"mc2_stderr": 0.014908191115467387
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.01056902112282591
},
"harness|gsm8k|5": {
"acc": 0.34268385140257773,
"acc_stderr": 0.01307303023082791
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31 | [
"region:us"
] | 2024-01-22T02:16:14+00:00 | {"pretty_name": "Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31", "dataset_summary": "Dataset automatically created during the evaluation run of model [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:13:58.257879](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31/blob/main/results_2024-01-22T02-13-58.257879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5330985732271527,\n \"acc_stderr\": 0.034185007803077,\n \"acc_norm\": 0.5352323665963996,\n \"acc_norm_stderr\": 0.034920748737001794,\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5134609475665187,\n \"mc2_stderr\": 0.014908191115467387\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693028\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6441943835889266,\n \"acc_stderr\": 0.004777782584817781,\n \"acc_norm\": 0.8419637522405895,\n \"acc_norm_stderr\": 0.003640294912838683\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 
0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.02757596072327824,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.02757596072327824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7098445595854922,\n \"acc_stderr\": 
0.03275264467791516,\n \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829114,\n \"acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829114\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.0484674825397724,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.0484674825397724\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n 
\"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.7650063856960408,\n \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.01520103251252044,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.01520103251252044\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.0282135041778241,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.0282135041778241\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n \"acc_stderr\": 0.02809924077580955,\n \"acc_norm\": 0.572347266881029,\n \"acc_norm_stderr\": 0.02809924077580955\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n \"acc_stderr\": 0.01251018163696068,\n \"acc_norm\": 0.39960886571056065,\n \"acc_norm_stderr\": 0.01251018163696068\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003483,\n \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003483\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5134609475665187,\n \"mc2_stderr\": 0.014908191115467387\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.01056902112282591\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.34268385140257773,\n \"acc_stderr\": 0.01307303023082791\n }\n}\n```", "repo_url": "https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["**/details_harness|winogrande|5_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T02-13-58.257879.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T02_13_58.257879", "path": ["results_2024-01-22T02-13-58.257879.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T02-13-58.257879.parquet"]}]}]} | 2024-01-22T02:16:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31
Dataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
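A minimal sketch (the repository id and the "harness_winogrande_5" config name are taken from this repo's own config list):
```python
from datasets import load_dataset

# load the per-task details for one of the 63 configs; split="train" always points at the latest run
data = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31",
	"harness_winogrande_5",
	split="train")
```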
## Latest results
These are the latest results from run 2024-01-22T02:13:58.257879 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
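To pull those aggregated numbers yourself, a minimal sketch (the "results" config and its "latest" split are the ones named in this repo's metadata):
```python
from datasets import load_dataset

# "results" aggregates every task; the "latest" split mirrors the most recent timestamped run
results = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31",
	"results",
	split="latest")
print(results[0])
```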
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31\n\n\n\nDataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:13:58.257879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31\n\n\n\nDataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:13:58.257879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
da97cf8cf1416aa75f836a2c35b0f7f4bbb090e0 |
# Dataset Card for Evaluation run of andysalerno/cloudymixtral7Bx2-nectar-0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/cloudymixtral7Bx2-nectar-0.2](https://huggingface.co/andysalerno/cloudymixtral7Bx2-nectar-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2",
"harness_winogrande_5",
split="train")
```
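The aggregated metrics live in the additional "results" configuration mentioned above. As a minimal sketch (assuming that configuration exposes the same timestamped and "latest" splits as the per-task configurations), it can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated results of the run; "latest" should point to the most recent
# evaluation, while timestamped splits keep the earlier run(s).
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2",
    "results",
    split="latest",
)
```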
## Latest results
These are the [latest results from run 2024-01-22T02:17:36.925599](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2/blob/main/results_2024-01-22T02-17-36.925599.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6411500131859755,
"acc_stderr": 0.03188163161208531,
"acc_norm": 0.6539831613919124,
"acc_norm_stderr": 0.032683317989685615,
"mc1": 0.5226438188494492,
"mc1_stderr": 0.017485542258489636,
"mc2": 0.6873292641569112,
"mc2_stderr": 0.015222039787426868
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.01396014260059868,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6092411870145389,
"acc_stderr": 0.004869232758103324,
"acc_norm": 0.8077076279625572,
"acc_norm_stderr": 0.003932960974008082
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289736,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524012,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.0283329595140312,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.0283329595140312
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5226438188494492,
"mc1_stderr": 0.017485542258489636,
"mc2": 0.6873292641569112,
"mc2_stderr": 0.015222039787426868
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998285
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887282
}
}
```
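As a rough post-processing sketch (the file name comes from the link above; the aggregation choice and the top-level "results" key are assumptions about the file layout), the per-task scores can be downloaded and averaged, for example over the MMLU (hendrycksTest) subtasks:

```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# Download the raw results file for the latest run from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2",
    filename="results_2024-01-22T02-17-36.925599.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The per-task dictionary shown above is typically stored under a "results"
# key; fall back to the top level if the layout differs.
scores = data.get("results", data)

# Mean normalized accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in scores.items() if "hendrycksTest" in k]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean(mmlu):.4f}")
```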
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2 | [
"region:us"
] | 2024-01-22T02:17:24+00:00 | {"pretty_name": "Evaluation run of andysalerno/cloudymixtral7Bx2-nectar-0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/cloudymixtral7Bx2-nectar-0.2](https://huggingface.co/andysalerno/cloudymixtral7Bx2-nectar-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:17:36.925599](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2/blob/main/results_2024-01-22T02-17-36.925599.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6411500131859755,\n \"acc_stderr\": 0.03188163161208531,\n \"acc_norm\": 0.6539831613919124,\n \"acc_norm_stderr\": 0.032683317989685615,\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.017485542258489636,\n \"mc2\": 0.6873292641569112,\n \"mc2_stderr\": 0.015222039787426868\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.01396014260059868,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6092411870145389,\n \"acc_stderr\": 0.004869232758103324,\n \"acc_norm\": 0.8077076279625572,\n \"acc_norm_stderr\": 0.003932960974008082\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 
0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289736,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 
0.020986854593289736\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524012,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524012\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.0283329595140312,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.0283329595140312\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.017485542258489636,\n \"mc2\": 0.6873292641569112,\n \"mc2_stderr\": 0.015222039787426868\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.011372251705837756,\n \"acc_stderr\": 0.0029206661987887282\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/cloudymixtral7Bx2-nectar-0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-15-08.544766.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-15-08.544766.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-17-36.925599.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-17-36.925599.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-17-36.925599.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-17-36.925599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-17-36.925599.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": 
"2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-15-08.544766.parquet"]}, 
{"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["**/details_harness|winogrande|5_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": ["**/details_harness|winogrande|5_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T02-17-36.925599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T02_15_08.544766", "path": ["results_2024-01-22T02-15-08.544766.parquet"]}, {"split": "2024_01_22T02_17_36.925599", "path": 
["results_2024-01-22T02-17-36.925599.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T02-17-36.925599.parquet"]}]}]} | 2024-01-22T02:20:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of andysalerno/cloudymixtral7Bx2-nectar-0.2
Dataset automatically created during the evaluation run of model andysalerno/cloudymixtral7Bx2-nectar-0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
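The original loading snippet was not preserved in this copy of the card; below is a minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repo-naming convention and the `harness_winogrande_5` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_andysalerno__cloudymixtral7Bx2-nectar-0.2",
	"harness_winogrande_5",
	split="train")
```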
## Latest results
These are the latest results from run 2024-01-22T02:17:36.925599 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of andysalerno/cloudymixtral7Bx2-nectar-0.2\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/cloudymixtral7Bx2-nectar-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:17:36.925599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andysalerno/cloudymixtral7Bx2-nectar-0.2\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/cloudymixtral7Bx2-nectar-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:17:36.925599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f688c393221bbb87bf676e8a5ad33841dd23e92a |
# Dataset Description
We are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.
**Note**: We use evidence as a general term to refer to a symptom or an antecedent.
This directory contains the following files (a short loading sketch follows the list):
- **release_evidences.json**: a JSON file describing all possible evidences considered in the dataset.
- **release_conditions.json**: a JSON file describing all pathologies considered in the dataset.
- **release_train_patients.zip**: a CSV file containing the patients of the training set.
- **release_validate_patients.zip**: a CSV file containing the patients of the validation set.
- **release_test_patients.zip**: a CSV file containing the patients of the test set.
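For a quick look at the patient files, the zipped CSVs can be read directly; a minimal sketch, assuming pandas and the column names described in the Patient Description section below:

```python
import pandas as pd

# pandas reads a zip archive containing a single CSV transparently.
train_df = pd.read_csv("release_train_patients.zip")

# Expected columns (see "Patient Description" below); order may differ in the file.
print(train_df.columns.tolist())
print(len(train_df), "training patients")
```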
## Evidence Description
Each evidence in the `release_evidences.json` file is described using the following entries (a hypothetical entry is sketched after the list):
- **name**: name of the evidence.
 - **code_question**: a code used to identify which evidences are related. Evidences having the same `code_question` form a group of related symptoms. The value of the `code_question` refers to the evidence that needs to be simulated/activated for the other members of the group to be eventually simulated.
 - **question_fr**: the query, in French, associated with the evidence.
 - **question_en**: the query, in English, associated with the evidence.
- **is_antecedent**: a flag indicating whether the evidence is an antecedent or a symptom.
- **data_type**: the type of evidence. We use `B` for binary, `C` for categorical, and `M` for multi-choice evidences.
- **default_value**: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.
- **possible-values**: the possible values for the evidences. Only valid for categorical and multi-choice evidences.
- **value_meaning**: The meaning, in French and English, of each code that is part of the `possible-values` field. Only valid for categorical and multi-choice evidences.
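For illustration, the shape of a single categorical entry is sketched below as a Python dict; only the field names come from the list above, while the evidence name, questions and values are invented:

```python
# Hypothetical evidence entry -- all values are illustrative, not taken from the real file.
example_evidence = {
    "name": "pain_intensity",                      # invented evidence name
    "code_question": "pain_intensity",             # group key: the evidence that triggers the group
    "question_fr": "Quelle est l'intensité de la douleur ?",
    "question_en": "How intense is the pain?",
    "is_antecedent": False,                        # symptom, not an antecedent
    "data_type": "C",                              # B = binary, C = categorical, M = multi-choice
    "default_value": 0,
    "possible-values": [0, 1, 2, 3, 4, 5],
    "value_meaning": {"0": {"fr": "aucune douleur", "en": "no pain"}},  # layout illustrative
}
```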
## Pathology Description
The file `release_conditions.json` contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:
- **condition_name**: name of the pathology.
- **cond-name-fr**: name of the pathology in French.
- **cond-name-eng**: name of the pathology in English.
- **icd10-id**: ICD-10 code of the pathology.
 - **severity**: the severity associated with the pathology. The lower the value, the more severe the pathology.
- **symptoms**: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding `name` entry in the `release_evidences.json` file.
- **antecedents**: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding `name` entry in the `release_evidences.json` file.
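Similarly, a hypothetical pathology entry, again with only the field names taken from the list above:

```python
# Hypothetical condition entry -- the disease, codes and values are invented.
example_condition = {
    "condition_name": "Example disease",  # invented
    "cond-name-fr": "Maladie exemple",
    "cond-name-eng": "Example disease",
    "icd10-id": "X00.0",                  # invented ICD-10 code
    "severity": 3,                        # lower value = more severe
    # 'symptoms' and 'antecedents' reference evidences by their 'name' entry in
    # release_evidences.json; the exact nested layout is best checked in the file itself.
    "symptoms": {},
    "antecedents": {},
}
```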
## Patient Description
Each patient in each of the 3 sets has the following attributes (a small parsing sketch follows the list):
- **AGE**: the age of the synthesized patient.
- **SEX**: the sex of the synthesized patient.
- **PATHOLOGY**: name of the ground truth pathology (`condition_name` property in the `release_conditions.json` file) that the synthesized patient is suffering from.
- **EVIDENCES**: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format `[evidence-name]_@_[evidence-value]` where [`evidence-name`] is the name of the evidence (`name` entry in the `release_evidences.json` file) and [`evidence-value`] is a value from the `possible-values` entry. Note that for a multi-choice evidence, it is possible to have several `[evidence-name]_@_[evidence-value]` items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as `[evidence-name]`.
- **INITIAL_EVIDENCE**: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., `EVIDENCES`) and it is part of this list.
- **DIFFERENTIAL_DIAGNOSIS**: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form `[[patho_1, proba_1], [patho_2, proba_2], ...]` where `patho_i` is the pathology name (`condition_name` entry in the `release_conditions.json` file) and `proba_i` is its related probability.
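Putting the pieces together, here is a minimal parsing sketch; the `_@_` separator and column names come from the description above, while the assumption that the list-valued columns are serialized as Python-style literals (hence `ast.literal_eval`) is ours:

```python
import ast
import pandas as pd

patients = pd.read_csv("release_test_patients.zip")
row = patients.iloc[0]

# EVIDENCES and DIFFERENTIAL_DIAGNOSIS are assumed to be string-encoded lists.
evidences = ast.literal_eval(row["EVIDENCES"])
differential = ast.literal_eval(row["DIFFERENTIAL_DIAGNOSIS"])

for ev in evidences:
    if "_@_" in ev:                       # categorical or multi-choice evidence
        name, value = ev.split("_@_", 1)
    else:                                 # binary evidence
        name, value = ev, True
    print(name, value)

# Ground-truth differential: list of [pathology_name, probability] pairs.
for patho, proba in differential:
    print(f"{patho}: {proba:.3f}")
```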
## Note:
We hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.
It is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.
In the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means fewer potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.
For more information, please check our [paper](https://arxiv.org/abs/2205.09148). | aai530-group6/ddxplus-french | [
"task_categories:tabular-classification",
"task_ids:multi-class-classification",
"size_categories:1K<n<10K",
"source_datasets:original",
"language:fr",
"license:cc-by-4.0",
"automatic-diagnosis",
"automatic-symptom-detection",
"differential-diagnosis",
"synthetic-patients",
"diseases",
"health-care",
"arxiv:2205.09148",
"region:us"
] | 2024-01-22T02:17:41+00:00 | {"language": ["fr"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["tabular-classification"], "task_ids": ["multi-class-classification"], "paperswithcode_id": "ddxplus", "pretty_name": "DDXPlus", "license_link": "https://creativecommons.org/licenses/by/4.0/", "tags": ["automatic-diagnosis", "automatic-symptom-detection", "differential-diagnosis", "synthetic-patients", "diseases", "health-care"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.csv"}, {"split": "test", "path": "test.csv"}, {"split": "validate", "path": "validate.csv"}]}], "extra_gated_prompt": "By accessing this dataset, you agree to use it solely for research purposes and not for clinical decision-making.", "extra_gated_fields": {"Consent": "checkbox", "Purpose of use": {"type": "select", "options": ["Research", "Educational", {"label": "Other", "value": "other"}]}}, "train-eval-index": [{"config": "default", "task": "medical-diagnosis", "task_id": "binary-classification", "splits": {"train_split": "train", "eval_split": "validate"}, "col_mapping": {"AGE": "AGE", "SEX": "SEX", "PATHOLOGY": "PATHOLOGY", "EVIDENCES": "EVIDENCES", "INITIAL_EVIDENCE": "INITIAL_EVIDENCE", "DIFFERENTIAL_DIAGNOSIS": "DIFFERENTIAL_DIAGNOSIS"}, "metrics": [{"type": "accuracy", "name": "Accuracy"}, {"type": "f1", "name": "F1 Score"}]}]} | 2024-01-22T03:35:29+00:00 | [
"2205.09148"
] | [
"fr"
] | TAGS
#task_categories-tabular-classification #task_ids-multi-class-classification #size_categories-1K<n<10K #source_datasets-original #language-French #license-cc-by-4.0 #automatic-diagnosis #automatic-symptom-detection #differential-diagnosis #synthetic-patients #diseases #health-care #arxiv-2205.09148 #region-us
|
# Dataset Description
We are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.
Note: We use evidence as a general term to refer to a symptom or an antecedent.
This directory contains the following files:
- release_evidences.json: a JSON file describing all possible evidences considered in the dataset.
- release_conditions.json: a JSON file describing all pathologies considered in the dataset.
- release_train_patients.zip: a CSV file containing the patients of the training set.
- release_validate_patients.zip: a CSV file containing the patients of the validation set.
- release_test_patients.zip: a CSV file containing the patients of the test set.
## Evidence Description
Each evidence in the 'release_evidences.json' file is described using the following entries:
- name: name of the evidence.
- code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.
- question_fr: the query, in French, associated to the evidence.
- question_en: the query, in English, associated to the evidence.
- is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.
- data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.
- default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.
- possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.
- value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.
## Pathology Description
The file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:
- condition_name: name of the pathology.
- cond-name-fr: name of the pathology in French.
- cond-name-eng: name of the pathology in English.
- icd10-id: ICD-10 code of the pathology.
- severity: the severity associated with the pathology. The lower the more severe.
- symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.
- antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.
## Patient Description
Each patient in each of the 3 sets has the following attributes:
- AGE: the age of the synthesized patient.
- SEX: the sex of the synthesized patient.
- PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.
- EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.
- INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.
- DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.
## Note:
We hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.
It is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.
In the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.
For more information, please check our paper. | [
"# Dataset Description\n\nWe are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.\n\nNote: We use evidence as a general term to refer to a symptom or an antecedent.\n\nThis directory contains the following files:\n - release_evidences.json: a JSON file describing all possible evidences considered in the dataset.\n - release_conditions.json: a JSON file describing all pathologies considered in the dataset.\n - release_train_patients.zip: a CSV file containing the patients of the training set.\n - release_validate_patients.zip: a CSV file containing the patients of the validation set.\n - release_test_patients.zip: a CSV file containing the patients of the test set.",
"## Evidence Description\n\nEach evidence in the 'release_evidences.json' file is described using the following entries:\n - name: name of the evidence.\n - code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.\n - question_fr: the query, in French, associated to the evidence.\n - question_en: the query, in English, associated to the evidence.\n - is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.\n - data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.\n - default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.\n - possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.\n - value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.",
"## Pathology Description\nThe file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:\n - condition_name: name of the pathology.\n - cond-name-fr: name of the pathology in French.\n - cond-name-eng: name of the pathology in English.\n - icd10-id: ICD-10 code of the pathology.\n - severity: the severity associated with the pathology. The lower the more severe.\n - symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.\n - antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.",
"## Patient Description\n\nEach patient in each of the 3 sets has the following attributes:\n - AGE: the age of the synthesized patient.\n - SEX: the sex of the synthesized patient.\n - PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.\n - EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.\n - INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.\n - DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.",
"## Note:\n\nWe hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.\n\nIt is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.\n\nIn the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.\n\nFor more information, please check our paper."
] | [
"TAGS\n#task_categories-tabular-classification #task_ids-multi-class-classification #size_categories-1K<n<10K #source_datasets-original #language-French #license-cc-by-4.0 #automatic-diagnosis #automatic-symptom-detection #differential-diagnosis #synthetic-patients #diseases #health-care #arxiv-2205.09148 #region-us \n",
"# Dataset Description\n\nWe are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.\n\nNote: We use evidence as a general term to refer to a symptom or an antecedent.\n\nThis directory contains the following files:\n - release_evidences.json: a JSON file describing all possible evidences considered in the dataset.\n - release_conditions.json: a JSON file describing all pathologies considered in the dataset.\n - release_train_patients.zip: a CSV file containing the patients of the training set.\n - release_validate_patients.zip: a CSV file containing the patients of the validation set.\n - release_test_patients.zip: a CSV file containing the patients of the test set.",
"## Evidence Description\n\nEach evidence in the 'release_evidences.json' file is described using the following entries:\n - name: name of the evidence.\n - code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.\n - question_fr: the query, in French, associated to the evidence.\n - question_en: the query, in English, associated to the evidence.\n - is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.\n - data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.\n - default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.\n - possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.\n - value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.",
"## Pathology Description\nThe file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:\n - condition_name: name of the pathology.\n - cond-name-fr: name of the pathology in French.\n - cond-name-eng: name of the pathology in English.\n - icd10-id: ICD-10 code of the pathology.\n - severity: the severity associated with the pathology. The lower the more severe.\n - symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.\n - antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.",
"## Patient Description\n\nEach patient in each of the 3 sets has the following attributes:\n - AGE: the age of the synthesized patient.\n - SEX: the sex of the synthesized patient.\n - PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.\n - EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.\n - INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.\n - DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.",
"## Note:\n\nWe hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.\n\nIt is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.\n\nIn the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.\n\nFor more information, please check our paper."
] |
d180f35e87fbc77d99133cc901ddc340fe78ad78 |
# Dataset Card for Evaluation run of antiven0m/brugle-rp
Dataset automatically created during the evaluation run of model [antiven0m/brugle-rp](https://huggingface.co/antiven0m/brugle-rp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_antiven0m__brugle-rp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T02:19:10.123124](https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__brugle-rp/blob/main/results_2024-01-22T02-19-10.123124.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
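As a quick way to work with these numbers, the sketch below averages the 5-shot MMLU (hendrycksTest) accuracies. It assumes `results` already holds the dictionary printed above (for example, parsed with `json.load` from the downloaded results file; the exact nesting of the top-level keys in that file should be checked):
```python
def mean_mmlu_accuracy(results: dict) -> float:
    """Average 5-shot accuracy over all harness|hendrycksTest-* subtasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Example (assuming `results` is the dictionary shown above):
# mean_mmlu_accuracy(results)  # roughly 0.23 for this run
```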
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_antiven0m__brugle-rp | [
"region:us"
] | 2024-01-22T02:21:29+00:00 | {"pretty_name": "Evaluation run of antiven0m/brugle-rp", "dataset_summary": "Dataset automatically created during the evaluation run of model [antiven0m/brugle-rp](https://huggingface.co/antiven0m/brugle-rp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_antiven0m__brugle-rp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:19:10.123124](https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__brugle-rp/blob/main/results_2024-01-22T02-19-10.123124.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 
0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n 
\"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 
0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/antiven0m/brugle-rp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["**/details_harness|winogrande|5_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T02-19-10.123124.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T02_19_10.123124", "path": ["results_2024-01-22T02-19-10.123124.parquet"]}, {"split": "latest", "path": 
["results_2024-01-22T02-19-10.123124.parquet"]}]}]} | 2024-01-22T02:21:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of antiven0m/brugle-rp
Dataset automatically created during the evaluation run of model antiven0m/brugle-rp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-22T02:19:10.123124 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of antiven0m/brugle-rp\n\n\n\nDataset automatically created during the evaluation run of model antiven0m/brugle-rp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:19:10.123124(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of antiven0m/brugle-rp\n\n\n\nDataset automatically created during the evaluation run of model antiven0m/brugle-rp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:19:10.123124(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b31041ceba7babbf46f267030b651bf4e53e3b1f |
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r256
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r256](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r256) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256",
"harness_winogrande_5",
split="train")
```
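
Per-example details for a single MMLU subtask live in their own config. A minimal sketch, assuming the config names listed in this card's metadata (e.g. `harness_hendrycksTest_management_5`) and the `latest` split defined there resolve as expected in your `datasets` version:

```python
from datasets import load_dataset

# Minimal sketch: each subtask has its own config; per the file list in this card,
# every config exposes a timestamped split plus a "latest" split.
details = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256",
    "harness_hendrycksTest_management_5",  # any config name from the metadata works
    split="latest",
)
print(details)               # per-example predictions and scores for that subtask
print(details.column_names)  # inspect the schema before relying on specific fields
```

The column layout of the detail parquets is not documented in this card, so checking `column_names` first is the safer path.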
## Latest results
These are the [latest results from run 2024-01-22T02:30:15.872651](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256/blob/main/results_2024-01-22T02-30-15.872651.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6508661042256001,
"acc_stderr": 0.03172035451202671,
"acc_norm": 0.6620431850943416,
"acc_norm_stderr": 0.03255546322018798,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626605,
"mc2": 0.5181219506294198,
"mc2_stderr": 0.015229145792254558
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520772,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6358295160326628,
"acc_stderr": 0.004802133511654238,
"acc_norm": 0.8222465644293966,
"acc_norm_stderr": 0.0038152372699611094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6425531914893617,
"acc_stderr": 0.031329417894764254,
"acc_norm": 0.6425531914893617,
"acc_norm_stderr": 0.031329417894764254
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.02571523981134676,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.02571523981134676
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823074,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.034653044884067945,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.034653044884067945
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066468,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066468
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.01428337804429641,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.01428337804429641
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3754189944134078,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.3754189944134078,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729338,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626605,
"mc2": 0.5181219506294198,
"mc2_stderr": 0.015229145792254558
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479653
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909434
}
}
```
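
To pull the headline numbers out of the JSON above without going through `datasets`, plain-Python parsing is enough. A minimal sketch, assuming the block has been saved locally as `results.json` (a hypothetical path; the hosted file is linked above):

```python
import json

# Minimal sketch: read the aggregated metrics shown in the "Latest results" block.
with open("results.json") as f:  # hypothetical local copy of the linked results file
    results = json.load(f)

overall = results["all"]
print(f'average acc:      {overall["acc"]:.4f} ± {overall["acc_stderr"]:.4f}')
print(f'average acc_norm: {overall["acc_norm"]:.4f} ± {overall["acc_norm_stderr"]:.4f}')
print(f'truthfulqa mc2:   {overall["mc2"]:.4f}')
print(f'gsm8k acc:        {results["harness|gsm8k|5"]["acc"]:.4f}')
print(f'winogrande acc:   {results["harness|winogrande|5"]["acc"]:.4f}')
```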
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256 | [
"region:us"
] | 2024-01-22T02:32:34+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Stellaris-internlm2-20b-r256", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r256](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r256) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:30:15.872651](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256/blob/main/results_2024-01-22T02-30-15.872651.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508661042256001,\n \"acc_stderr\": 0.03172035451202671,\n \"acc_norm\": 0.6620431850943416,\n \"acc_norm_stderr\": 0.03255546322018798,\n \"mc1\": 0.3390452876376989,\n \"mc1_stderr\": 0.016571797910626605,\n \"mc2\": 0.5181219506294198,\n \"mc2_stderr\": 0.015229145792254558\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520772,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6358295160326628,\n \"acc_stderr\": 0.004802133511654238,\n \"acc_norm\": 0.8222465644293966,\n \"acc_norm_stderr\": 0.0038152372699611094\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810536,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810536\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6425531914893617,\n \"acc_stderr\": 0.031329417894764254,\n \"acc_norm\": 0.6425531914893617,\n \"acc_norm_stderr\": 0.031329417894764254\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823074,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.034653044884067945,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.034653044884067945\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8007662835249042,\n \"acc_stderr\": 0.01428337804429641,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.01428337804429641\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729338,\n \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729338\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n \"mc1_stderr\": 0.016571797910626605,\n \"mc2\": 0.5181219506294198,\n \"mc2_stderr\": 0.015229145792254558\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479653\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.0030152942428909434\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r256", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-30-15.872651.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-30-15.872651.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-30-15.872651.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-30-15.872651.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-30-15.872651.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_30_15.872651", "path": ["**/details_harness|winogrande|5_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T02-30-15.872651.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T02_30_15.872651", "path": ["results_2024-01-22T02-30-15.872651.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T02-30-15.872651.parquet"]}]}]} | 2024-01-22T02:32:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r256
Dataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r256 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
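For example, with the `datasets` library (a minimal sketch — the repository id below is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming convention, and the config name is just one of the 63 listed in this card):

```python
from datasets import load_dataset

# Assumed repository id (not stated explicitly in this card).
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r256",
    "harness_winogrande_5",  # any of the 63 configs listed in this card
    split="latest",          # or the timestamped split of a specific run
)
```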
## Latest results
These are the latest results from run 2024-01-22T02:30:15.872651 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r256\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r256 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:30:15.872651(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r256\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r256 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:30:15.872651(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f61b365f6f335c967b4703868adaaf1e49905959 |
# Dataset Card for "mlqa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/facebookresearch/MLQA](https://github.com/facebookresearch/MLQA)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 4.15 GB
- **Size of the generated dataset:** 910.01 MB
- **Total amount of disk used:** 5.06 GB
### Dataset Summary
MLQA (MultiLingual Question Answering) is a benchmark dataset for evaluating cross-lingual question answering performance.
MLQA consists of over 5K extractive QA instances (12K in English) in SQuAD format in seven languages - English, Arabic,
German, Spanish, Hindi, Vietnamese and Simplified Chinese. MLQA is highly parallel, with QA instances parallel between
4 different languages on average.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
MLQA contains QA instances in 7 languages, English, Arabic, German, Spanish, Hindi, Vietnamese and Simplified Chinese.
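For instance, a specific configuration can be loaded by name with the `datasets` library (a minimal sketch; the Hub id `mlqa` is an assumption — adjust it to the copy you are using, e.g. `TheTung/mlqa`):

```python
from datasets import load_dataset

# Config names follow the patterns used in this dataset, e.g.
# "mlqa.<context_lang>.<question_lang>" or "mlqa-translate-test.<lang>".
en_de = load_dataset("mlqa", "mlqa.en.de", split="test")
ar_translated = load_dataset("mlqa", "mlqa-translate-test.ar", split="test")

print(en_de[0]["question"])
```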
## Dataset Structure
### Data Instances
#### mlqa-translate-test.ar
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 5.48 MB
- **Total amount of disk used:** 15.56 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.de
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 3.88 MB
- **Total amount of disk used:** 13.96 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.es
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 3.92 MB
- **Total amount of disk used:** 13.99 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.hi
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 4.61 MB
- **Total amount of disk used:** 14.68 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.vi
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 6.00 MB
- **Total amount of disk used:** 16.07 MB
An example of 'test' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### mlqa-translate-test.ar
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `answer_start`: a `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.de
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `answer_start`: a `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.es
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `answer_start`: a `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.hi
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `answer_start`: a `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.vi
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `answer_start`: a `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
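Concretely, these fields can be accessed as follows (a minimal sketch; the dataset and config ids are assumptions, and the invariant in the last line is the usual SQuAD-format convention):

```python
from datasets import load_dataset

# Assumed ids; any of the configs described above works the same way.
ds = load_dataset("mlqa", "mlqa-translate-test.ar", split="test")
example = ds[0]

question = example["question"]                  # string
context = example["context"]                    # string
answer_text = example["answers"]["text"][0]     # gold answer span
start = example["answers"]["answer_start"][0]   # character offset into the context

# In SQuAD-format data the offset should recover the answer span directly:
assert context[start:start + len(answer_text)] == answer_text
```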
### Data Splits
| name |test|
|----------------------|---:|
|mlqa-translate-test.ar|5335|
|mlqa-translate-test.de|4517|
|mlqa-translate-test.es|5253|
|mlqa-translate-test.hi|4918|
|mlqa-translate-test.vi|5495|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{lewis2019mlqa,
  title = {MLQA: Evaluating Cross-lingual Extractive Question Answering},
  author = {Lewis, Patrick and Oguz, Barlas and Rinott, Ruty and Riedel, Sebastian and Schwenk, Holger},
  journal = {arXiv preprint arXiv:1910.07475},
  year = 2019,
  eid = {arXiv: 1910.07475}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@M-Salti](https://github.com/M-Salti), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@lhoestq](https://github.com/lhoestq) for adding this dataset.
---
license: apache-2.0
---
| TheTung/mlqa | [
"task_categories:question-answering",
"task_ids:extractive-qa",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:multilingual",
"size_categories:10K<n<100K",
"source_datasets:original",
"language:en",
"language:de",
"language:es",
"language:ar",
"language:zh",
"language:vi",
"language:hi",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-01-22T02:35:45+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["en", "de", "es", "ar", "zh", "vi", "hi"], "license": ["cc-by-sa-3.0"], "multilinguality": ["multilingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"], "task_categories": ["question-answering"], "task_ids": ["extractive-qa"], "paperswithcode_id": "mlqa", "pretty_name": "MLQA (MultiLingual Question Answering)", "dataset_info": [{"config_name": "mlqa-translate-train.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 101227245, "num_examples": 78058}, {"name": "validation", "num_bytes": 13144332, "num_examples": 9512}], "download_size": 63364123, "dataset_size": 114371577}, {"config_name": "mlqa-translate-train.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 77996825, "num_examples": 80069}, {"name": "validation", "num_bytes": 10322113, "num_examples": 9927}], "download_size": 63364123, "dataset_size": 88318938}, {"config_name": "mlqa-translate-train.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 97387431, "num_examples": 84816}, {"name": "validation", "num_bytes": 12731112, "num_examples": 10356}], "download_size": 63364123, "dataset_size": 110118543}, {"config_name": "mlqa-translate-train.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 55143547, "num_examples": 76285}, {"name": "validation", "num_bytes": 7418070, "num_examples": 9568}], "download_size": 63364123, "dataset_size": 62561617}, {"config_name": "mlqa-translate-train.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80789653, "num_examples": 81810}, {"name": "validation", "num_bytes": 10718376, "num_examples": 10123}], "download_size": 63364123, "dataset_size": 91508029}, {"config_name": "mlqa-translate-train.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 168117671, "num_examples": 82451}, {"name": "validation", "num_bytes": 22422152, "num_examples": 10253}], "download_size": 63364123, "dataset_size": 190539823}, {"config_name": "mlqa-translate-test.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": 
"string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5484467, "num_examples": 5335}], "download_size": 10075488, "dataset_size": 5484467}, {"config_name": "mlqa-translate-test.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3884332, "num_examples": 4517}], "download_size": 10075488, "dataset_size": 3884332}, {"config_name": "mlqa-translate-test.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5998327, "num_examples": 5495}], "download_size": 10075488, "dataset_size": 5998327}, {"config_name": "mlqa-translate-test.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4831704, "num_examples": 5137}], "download_size": 10075488, "dataset_size": 4831704}, {"config_name": "mlqa-translate-test.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3916758, "num_examples": 5253}], "download_size": 10075488, "dataset_size": 3916758}, {"config_name": "mlqa-translate-test.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4608811, "num_examples": 4918}], "download_size": 10075488, "dataset_size": 4608811}, {"config_name": "mlqa.ar.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 8216837, "num_examples": 5335}, {"name": "validation", "num_bytes": 808830, "num_examples": 517}], "download_size": 75719050, "dataset_size": 9025667}, {"config_name": "mlqa.ar.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2132247, "num_examples": 1649}, {"name": "validation", "num_bytes": 358554, "num_examples": 207}], "download_size": 75719050, "dataset_size": 2490801}, {"config_name": "mlqa.ar.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": 
"id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3235363, "num_examples": 2047}, {"name": "validation", "num_bytes": 283834, "num_examples": 163}], "download_size": 75719050, "dataset_size": 3519197}, {"config_name": "mlqa.ar.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3175660, "num_examples": 1912}, {"name": "validation", "num_bytes": 334016, "num_examples": 188}], "download_size": 75719050, "dataset_size": 3509676}, {"config_name": "mlqa.ar.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 8074057, "num_examples": 5335}, {"name": "validation", "num_bytes": 794775, "num_examples": 517}], "download_size": 75719050, "dataset_size": 8868832}, {"config_name": "mlqa.ar.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2981237, "num_examples": 1978}, {"name": "validation", "num_bytes": 223188, "num_examples": 161}], "download_size": 75719050, "dataset_size": 3204425}, {"config_name": "mlqa.ar.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2993225, "num_examples": 1831}, {"name": "validation", "num_bytes": 276727, "num_examples": 186}], "download_size": 75719050, "dataset_size": 3269952}, {"config_name": "mlqa.de.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1587005, "num_examples": 1649}, {"name": "validation", "num_bytes": 195822, "num_examples": 207}], "download_size": 75719050, "dataset_size": 1782827}, {"config_name": "mlqa.de.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4274496, "num_examples": 4517}, {"name": "validation", "num_bytes": 477366, "num_examples": 512}], "download_size": 75719050, "dataset_size": 4751862}, {"config_name": "mlqa.de.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1654540, "num_examples": 1675}, {"name": "validation", "num_bytes": 211985, "num_examples": 182}], "download_size": 75719050, "dataset_size": 1866525}, {"config_name": "mlqa.de.zh", "features": [{"name": 
"context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1645937, "num_examples": 1621}, {"name": "validation", "num_bytes": 180114, "num_examples": 190}], "download_size": 75719050, "dataset_size": 1826051}, {"config_name": "mlqa.de.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4251153, "num_examples": 4517}, {"name": "validation", "num_bytes": 474863, "num_examples": 512}], "download_size": 75719050, "dataset_size": 4726016}, {"config_name": "mlqa.de.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1678176, "num_examples": 1776}, {"name": "validation", "num_bytes": 166193, "num_examples": 196}], "download_size": 75719050, "dataset_size": 1844369}, {"config_name": "mlqa.de.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1343983, "num_examples": 1430}, {"name": "validation", "num_bytes": 150679, "num_examples": 163}], "download_size": 75719050, "dataset_size": 1494662}, {"config_name": "mlqa.vi.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3164094, "num_examples": 2047}, {"name": "validation", "num_bytes": 226724, "num_examples": 163}], "download_size": 75719050, "dataset_size": 3390818}, {"config_name": "mlqa.vi.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2189315, "num_examples": 1675}, {"name": "validation", "num_bytes": 272794, "num_examples": 182}], "download_size": 75719050, "dataset_size": 2462109}, {"config_name": "mlqa.vi.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 7807045, "num_examples": 5495}, {"name": "validation", "num_bytes": 715291, "num_examples": 511}], "download_size": 75719050, "dataset_size": 8522336}, {"config_name": "mlqa.vi.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 
2947458, "num_examples": 1943}, {"name": "validation", "num_bytes": 265154, "num_examples": 184}], "download_size": 75719050, "dataset_size": 3212612}, {"config_name": "mlqa.vi.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 7727204, "num_examples": 5495}, {"name": "validation", "num_bytes": 707925, "num_examples": 511}], "download_size": 75719050, "dataset_size": 8435129}, {"config_name": "mlqa.vi.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2822481, "num_examples": 2018}, {"name": "validation", "num_bytes": 279235, "num_examples": 189}], "download_size": 75719050, "dataset_size": 3101716}, {"config_name": "mlqa.vi.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2738045, "num_examples": 1947}, {"name": "validation", "num_bytes": 251470, "num_examples": 177}], "download_size": 75719050, "dataset_size": 2989515}, {"config_name": "mlqa.zh.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1697005, "num_examples": 1912}, {"name": "validation", "num_bytes": 171743, "num_examples": 188}], "download_size": 75719050, "dataset_size": 1868748}, {"config_name": "mlqa.zh.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1356268, "num_examples": 1621}, {"name": "validation", "num_bytes": 170686, "num_examples": 190}], "download_size": 75719050, "dataset_size": 1526954}, {"config_name": "mlqa.zh.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1770535, "num_examples": 1943}, {"name": "validation", "num_bytes": 169651, "num_examples": 184}], "download_size": 75719050, "dataset_size": 1940186}, {"config_name": "mlqa.zh.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4324740, "num_examples": 5137}, {"name": "validation", "num_bytes": 433960, "num_examples": 504}], "download_size": 75719050, "dataset_size": 4758700}, {"config_name": "mlqa.zh.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": 
"string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4353361, "num_examples": 5137}, {"name": "validation", "num_bytes": 437016, "num_examples": 504}], "download_size": 75719050, "dataset_size": 4790377}, {"config_name": "mlqa.zh.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1697983, "num_examples": 1947}, {"name": "validation", "num_bytes": 134693, "num_examples": 161}], "download_size": 75719050, "dataset_size": 1832676}, {"config_name": "mlqa.zh.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1547159, "num_examples": 1767}, {"name": "validation", "num_bytes": 180928, "num_examples": 189}], "download_size": 75719050, "dataset_size": 1728087}, {"config_name": "mlqa.en.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 6641971, "num_examples": 5335}, {"name": "validation", "num_bytes": 621075, "num_examples": 517}], "download_size": 75719050, "dataset_size": 7263046}, {"config_name": "mlqa.en.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4966262, "num_examples": 4517}, {"name": "validation", "num_bytes": 584725, "num_examples": 512}], "download_size": 75719050, "dataset_size": 5550987}, {"config_name": "mlqa.en.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 6958087, "num_examples": 5495}, {"name": "validation", "num_bytes": 631268, "num_examples": 511}], "download_size": 75719050, "dataset_size": 7589355}, {"config_name": "mlqa.en.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 6441614, "num_examples": 5137}, {"name": "validation", "num_bytes": 598772, "num_examples": 504}], "download_size": 75719050, "dataset_size": 7040386}, {"config_name": "mlqa.en.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 13787522, "num_examples": 11590}, {"name": "validation", 
"num_bytes": 1307399, "num_examples": 1148}], "download_size": 75719050, "dataset_size": 15094921}, {"config_name": "mlqa.en.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 6074990, "num_examples": 5253}, {"name": "validation", "num_bytes": 545657, "num_examples": 500}], "download_size": 75719050, "dataset_size": 6620647}, {"config_name": "mlqa.en.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 6293785, "num_examples": 4918}, {"name": "validation", "num_bytes": 614223, "num_examples": 507}], "download_size": 75719050, "dataset_size": 6908008}, {"config_name": "mlqa.es.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1696778, "num_examples": 1978}, {"name": "validation", "num_bytes": 145105, "num_examples": 161}], "download_size": 75719050, "dataset_size": 1841883}, {"config_name": "mlqa.es.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1361983, "num_examples": 1776}, {"name": "validation", "num_bytes": 139968, "num_examples": 196}], "download_size": 75719050, "dataset_size": 1501951}, {"config_name": "mlqa.es.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1707141, "num_examples": 2018}, {"name": "validation", "num_bytes": 172801, "num_examples": 189}], "download_size": 75719050, "dataset_size": 1879942}, {"config_name": "mlqa.es.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1635294, "num_examples": 1947}, {"name": "validation", "num_bytes": 122829, "num_examples": 161}], "download_size": 75719050, "dataset_size": 1758123}, {"config_name": "mlqa.es.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4249431, "num_examples": 5253}, {"name": "validation", "num_bytes": 408169, "num_examples": 500}], "download_size": 75719050, "dataset_size": 4657600}, {"config_name": "mlqa.es.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": 
"answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4281273, "num_examples": 5253}, {"name": "validation", "num_bytes": 411196, "num_examples": 500}], "download_size": 75719050, "dataset_size": 4692469}, {"config_name": "mlqa.es.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1489611, "num_examples": 1723}, {"name": "validation", "num_bytes": 178003, "num_examples": 187}], "download_size": 75719050, "dataset_size": 1667614}, {"config_name": "mlqa.hi.ar", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4374373, "num_examples": 1831}, {"name": "validation", "num_bytes": 402817, "num_examples": 186}], "download_size": 75719050, "dataset_size": 4777190}, {"config_name": "mlqa.hi.de", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2961556, "num_examples": 1430}, {"name": "validation", "num_bytes": 294325, "num_examples": 163}], "download_size": 75719050, "dataset_size": 3255881}, {"config_name": "mlqa.hi.vi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4664436, "num_examples": 1947}, {"name": "validation", "num_bytes": 411654, "num_examples": 177}], "download_size": 75719050, "dataset_size": 5076090}, {"config_name": "mlqa.hi.zh", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4281309, "num_examples": 1767}, {"name": "validation", "num_bytes": 416192, "num_examples": 189}], "download_size": 75719050, "dataset_size": 4697501}, {"config_name": "mlqa.hi.en", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 11245629, "num_examples": 4918}, {"name": "validation", "num_bytes": 1076115, "num_examples": 507}], "download_size": 75719050, "dataset_size": 12321744}, {"config_name": "mlqa.hi.es", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3789337, "num_examples": 1723}, {"name": "validation", "num_bytes": 412469, "num_examples": 187}], "download_size": 
75719050, "dataset_size": 4201806}, {"config_name": "mlqa.hi.hi", "features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "answer_start", "dtype": "int32"}, {"name": "text", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 11606982, "num_examples": 4918}, {"name": "validation", "num_bytes": 1115055, "num_examples": 507}], "download_size": 75719050, "dataset_size": 12722037}]} | 2024-01-22T02:42:54+00:00 | [] | [
"en",
"de",
"es",
"ar",
"zh",
"vi",
"hi"
] | TAGS
#task_categories-question-answering #task_ids-extractive-qa #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-multilingual #size_categories-10K<n<100K #source_datasets-original #language-English #language-German #language-Spanish #language-Arabic #language-Chinese #language-Vietnamese #language-Hindi #license-cc-by-sa-3.0 #region-us
| Dataset Card for "mlqa"
=======================
Table of Contents
-----------------
* Dataset Description
+ Dataset Summary
+ Supported Tasks and Leaderboards
+ Languages
* Dataset Structure
+ Data Instances
+ Data Fields
+ Data Splits
* Dataset Creation
+ Curation Rationale
+ Source Data
+ Annotations
+ Personal and Sensitive Information
* Considerations for Using the Data
+ Social Impact of Dataset
+ Discussion of Biases
+ Other Known Limitations
* Additional Information
+ Dataset Curators
+ Licensing Information
+ Citation Information
+ Contributions
Dataset Description
-------------------
* Homepage: URL
* Repository:
* Paper:
* Point of Contact:
* Size of downloaded dataset files: 4.15 GB
* Size of the generated dataset: 910.01 MB
* Total amount of disk used: 5.06 GB
### Dataset Summary
```
MLQA (MultiLingual Question Answering) is a benchmark dataset for evaluating cross-lingual question answering performance.
MLQA consists of over 5K extractive QA instances (12K in English) in SQuAD format in seven languages - English, Arabic,
German, Spanish, Hindi, Vietnamese and Simplified Chinese. MLQA is highly parallel, with QA instances parallel between
4 different languages on average.
```
### Supported Tasks and Leaderboards
### Languages
MLQA contains QA instances in 7 languages, English, Arabic, German, Spanish, Hindi, Vietnamese and Simplified Chinese.
Dataset Structure
-----------------
### Data Instances
#### URL
* Size of downloaded dataset files: 10.08 MB
* Size of the generated dataset: 5.48 MB
* Total amount of disk used: 15.56 MB
An example of 'test' looks as follows.
#### URL
* Size of downloaded dataset files: 10.08 MB
* Size of the generated dataset: 3.88 MB
* Total amount of disk used: 13.96 MB
An example of 'test' looks as follows.
#### URL
* Size of downloaded dataset files: 10.08 MB
* Size of the generated dataset: 3.92 MB
* Total amount of disk used: 13.99 MB
An example of 'test' looks as follows.
#### URL
* Size of downloaded dataset files: 10.08 MB
* Size of the generated dataset: 4.61 MB
* Total amount of disk used: 14.68 MB
An example of 'test' looks as follows.
#### URL
* Size of downloaded dataset files: 10.08 MB
* Size of the generated dataset: 6.00 MB
* Total amount of disk used: 16.07 MB
An example of 'test' looks as follows.
### Data Fields
The data fields are the same among all splits.
#### URL
* 'context': a 'string' feature.
* 'question': a 'string' feature.
* 'answers': a dictionary feature containing:
+ 'answer\_start': a 'int32' feature.
+ 'text': a 'string' feature.
* 'id': a 'string' feature.
#### URL
* 'context': a 'string' feature.
* 'question': a 'string' feature.
* 'answers': a dictionary feature containing:
+ 'answer\_start': a 'int32' feature.
+ 'text': a 'string' feature.
* 'id': a 'string' feature.
#### URL
* 'context': a 'string' feature.
* 'question': a 'string' feature.
* 'answers': a dictionary feature containing:
+ 'answer\_start': a 'int32' feature.
+ 'text': a 'string' feature.
* 'id': a 'string' feature.
#### URL
* 'context': a 'string' feature.
* 'question': a 'string' feature.
* 'answers': a dictionary feature containing:
+ 'answer\_start': a 'int32' feature.
+ 'text': a 'string' feature.
* 'id': a 'string' feature.
#### URL
* 'context': a 'string' feature.
* 'question': a 'string' feature.
* 'answers': a dictionary feature containing:
+ 'answer\_start': a 'int32' feature.
+ 'text': a 'string' feature.
* 'id': a 'string' feature.
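These fields fit together the same way in every configuration: the answer text is a span of the context starting at `answer_start`. A minimal sanity-check sketch (assuming an example dict with the fields listed above):

```python
def answer_span(example: dict) -> str:
    """Recover the answer text from the context via answer_start (illustrative helper)."""
    start = example["answers"]["answer_start"][0]
    text = example["answers"]["text"][0]
    # The slice of the context at answer_start should reproduce the stored answer text.
    assert example["context"][start:start + len(text)] == text
    return text
```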
### Data Splits
Dataset Creation
----------------
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
Considerations for Using the Data
---------------------------------
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
Additional Information
----------------------
### Dataset Curators
### Licensing Information
### Contributions
Thanks to @patrickvonplaten, @M-Salti, @lewtun, @thomwolf, @mariamabarham, @lhoestq for adding this dataset.
---
license: apache-2.0
-------------------
| [
"### Dataset Summary\n\n\n\n```\nMLQA (MultiLingual Question Answering) is a benchmark dataset for evaluating cross-lingual question answering performance.\nMLQA consists of over 5K extractive QA instances (12K in English) in SQuAD format in seven languages - English, Arabic,\nGerman, Spanish, Hindi, Vietnamese and Simplified Chinese. MLQA is highly parallel, with QA instances parallel between\n4 different languages on average.\n\n```",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nMLQA contains QA instances in 7 languages, English, Arabic, German, Spanish, Hindi, Vietnamese and Simplified Chinese.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 5.48 MB\n* Total amount of disk used: 15.56 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 3.88 MB\n* Total amount of disk used: 13.96 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 3.92 MB\n* Total amount of disk used: 13.99 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 4.61 MB\n* Total amount of disk used: 14.68 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 6.00 MB\n* Total amount of disk used: 16.07 MB\n\n\nAn example of 'test' looks as follows.",
"### Data Fields\n\n\nThe data fields are the same among all splits.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\n\nThanks to @patrickvonplaten, @M-Salti, @lewtun, @thomwolf, @mariamabarham, @lhoestq for adding this dataset.\n\n\n\n\n---\n\n\nlicense: apache-2.0\n-------------------"
] | [
"TAGS\n#task_categories-question-answering #task_ids-extractive-qa #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-multilingual #size_categories-10K<n<100K #source_datasets-original #language-English #language-German #language-Spanish #language-Arabic #language-Chinese #language-Vietnamese #language-Hindi #license-cc-by-sa-3.0 #region-us \n",
"### Dataset Summary\n\n\n\n```\nMLQA (MultiLingual Question Answering) is a benchmark dataset for evaluating cross-lingual question answering performance.\nMLQA consists of over 5K extractive QA instances (12K in English) in SQuAD format in seven languages - English, Arabic,\nGerman, Spanish, Hindi, Vietnamese and Simplified Chinese. MLQA is highly parallel, with QA instances parallel between\n4 different languages on average.\n\n```",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nMLQA contains QA instances in 7 languages, English, Arabic, German, Spanish, Hindi, Vietnamese and Simplified Chinese.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 5.48 MB\n* Total amount of disk used: 15.56 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 3.88 MB\n* Total amount of disk used: 13.96 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 3.92 MB\n* Total amount of disk used: 13.99 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 4.61 MB\n* Total amount of disk used: 14.68 MB\n\n\nAn example of 'test' looks as follows.",
"#### URL\n\n\n* Size of downloaded dataset files: 10.08 MB\n* Size of the generated dataset: 6.00 MB\n* Total amount of disk used: 16.07 MB\n\n\nAn example of 'test' looks as follows.",
"### Data Fields\n\n\nThe data fields are the same among all splits.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"#### URL\n\n\n* 'context': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers': a dictionary feature containing:\n\t+ 'answer\\_start': a 'int32' feature.\n\t+ 'text': a 'string' feature.\n* 'id': a 'string' feature.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\n\nThanks to @patrickvonplaten, @M-Salti, @lewtun, @thomwolf, @mariamabarham, @lhoestq for adding this dataset.\n\n\n\n\n---\n\n\nlicense: apache-2.0\n-------------------"
] |
02fda995fd57f3c2c7946632f3c73fa80728c204 |
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r512
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r512](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r512) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512",
"harness_winogrande_5",
split="train")
```
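To see which configurations exist (the 63 per-task ones plus the aggregated "results" configuration), the config names can be listed first; a sketch using the standard `datasets` utilities, with config and split naming as described above:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512"

# Per-task configurations (harness_arc_challenge_25, harness_gsm8k_5, ...) plus "results".
print(get_dataset_config_names(repo))

# The aggregated results; split names follow the run-timestamp convention described above.
results = load_dataset(repo, "results")
print(results)
```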
## Latest results
These are the [latest results from run 2024-01-22T02:33:44.720538](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512/blob/main/results_2024-01-22T02-33-44.720538.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6566278372965977,
"acc_stderr": 0.03172928992616415,
"acc_norm": 0.6659506224432014,
"acc_norm_stderr": 0.03244356975655913,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791691,
"mc2": 0.4950678335769212,
"mc2_stderr": 0.015192417727874554
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472437,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038076
},
"harness|hellaswag|10": {
"acc": 0.6601274646484764,
"acc_stderr": 0.00472697660713081,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016093
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167329,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167329
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371325,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5911330049261084,
"acc_stderr": 0.03459058815883233,
"acc_norm": 0.5911330049261084,
"acc_norm_stderr": 0.03459058815883233
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429914,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061627,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437216,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437216
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8642201834862385,
"acc_stderr": 0.014686907556340013,
"acc_norm": 0.8642201834862385,
"acc_norm_stderr": 0.014686907556340013
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508773,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508773
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594209,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.016547887997416112,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.016547887997416112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668886,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668886
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5071707953063885,
"acc_stderr": 0.012768922739553303,
"acc_norm": 0.5071707953063885,
"acc_norm_stderr": 0.012768922739553303
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.02671143055553841,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.02671143055553841
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791691,
"mc2": 0.4950678335769212,
"mc2_stderr": 0.015192417727874554
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.1463229719484458,
"acc_stderr": 0.009735210557785257
}
}
```
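The same numbers can also be fetched as the raw JSON file linked above and inspected locally; a minimal sketch using `huggingface_hub`, with the file name taken from the link in the paragraph above:

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512",
    filename="results_2024-01-22T02-33-44.720538.json",
    repo_type="dataset",
)

with open(path) as f:
    run_results = json.load(f)

# Inspect the structure; the aggregated metrics shown above live inside this file.
print(json.dumps(run_results, indent=2)[:500])
```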
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512 | [
"region:us"
] | 2024-01-22T02:35:49+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Stellaris-internlm2-20b-r512", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Stellaris-internlm2-20b-r512](https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r512) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T02:33:44.720538](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512/blob/main/results_2024-01-22T02-33-44.720538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6566278372965977,\n \"acc_stderr\": 0.03172928992616415,\n \"acc_norm\": 0.6659506224432014,\n \"acc_norm_stderr\": 0.03244356975655913,\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.01627228795791691,\n \"mc2\": 0.4950678335769212,\n \"mc2_stderr\": 0.015192417727874554\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472437,\n \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6601274646484764,\n \"acc_stderr\": 0.00472697660713081,\n \"acc_norm\": 0.8399721171081458,\n \"acc_norm_stderr\": 0.0036588262081016093\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741713,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745647,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745647\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167329,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167329\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n \"acc_stderr\": 0.021255464065371325,\n \"acc_norm\": 0.832258064516129,\n \"acc_norm_stderr\": 0.021255464065371325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5911330049261084,\n \"acc_stderr\": 0.03459058815883233,\n \"acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.03459058815883233\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429914,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429914\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061627,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061627\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437216,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437216\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8642201834862385,\n \"acc_stderr\": 0.014686907556340013,\n \"acc_norm\": 0.8642201834862385,\n \"acc_norm_stderr\": 0.014686907556340013\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508773,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508773\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n 
\"acc_stderr\": 0.014317653708594209,\n \"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594209\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.016547887997416112,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.016547887997416112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n \"acc_stderr\": 0.024406162094668886,\n \"acc_norm\": 0.7556270096463023,\n \"acc_norm_stderr\": 0.024406162094668886\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5071707953063885,\n \"acc_stderr\": 0.012768922739553303,\n \"acc_norm\": 0.5071707953063885,\n \"acc_norm_stderr\": 0.012768922739553303\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.02671143055553841,\n \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.02671143055553841\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.01627228795791691,\n \"mc2\": 0.4950678335769212,\n \"mc2_stderr\": 0.015192417727874554\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1463229719484458,\n \"acc_stderr\": 0.009735210557785257\n }\n}\n```", 
"repo_url": "https://huggingface.co/Weyaxi/Stellaris-internlm2-20b-r512", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-33-44.720538.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-33-44.720538.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-33-44.720538.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T02-33-44.720538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-33-44.720538.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T02_33_44.720538", "path": ["**/details_harness|winogrande|5_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T02-33-44.720538.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T02_33_44.720538", "path": ["results_2024-01-22T02-33-44.720538.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T02-33-44.720538.parquet"]}]}]} | 2024-01-22T02:36:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r512
Dataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r512 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
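
A minimal sketch of such a call is given below. The repository id is an assumption based on the usual Open LLM Leaderboard naming convention (`open-llm-leaderboard/details_<org>__<model>`); the configuration name and the `latest` split are taken from the configuration list in the metadata above.

```python
from datasets import load_dataset

# Hypothetical repository id following the usual leaderboard naming scheme;
# adjust it if the details for this run are published under a different id.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Stellaris-internlm2-20b-r512",
    "harness_winogrande_5",  # any of the configurations listed in the metadata works
    split="latest",
)
print(data)
```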
## Latest results
These are the latest results from run 2024-01-22T02:33:44.720538 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r512\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r512 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:33:44.720538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Stellaris-internlm2-20b-r512\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Stellaris-internlm2-20b-r512 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T02:33:44.720538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9a8677238e0b6ec76e94a522152e20a790f93405 | arXiv subset from MathPile_Commercial. Filtered for samples that are **8192 or less** in tokens length based on the Mistral tokenizer. | vilm/MathPile-arXiv-medium | [
"region:us"
] | 2024-01-22T02:38:36+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 833254840, "num_examples": 48005}], "download_size": 402095355, "dataset_size": 833254840}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T03:49:38+00:00 | [] | [] | TAGS
#region-us
| arXiv subset from MathPile_Commercial. Filtered for samples that are 8192 or less in tokens length based on the Mistral tokenizer. | [] | [
"TAGS\n#region-us \n"
] |
b4b6ca8a7aad867295cfe8d5b245f09a26d6483f |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | tanibt/crowne_plaza | [
"region:us"
] | 2024-01-22T02:41:06+00:00 | {} | 2024-01-22T02:42:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
89fe7cc1ddde1a0d86f30fd2da5af3f6aba6cf61 |
# BEE-spoke-data/sbert-paraphrase-data
Paraphrase data from [sentence-transformers](https://www.sbert.net/examples/training/paraphrases/README.html#datasets)
## contents
### default
| No. | Filename |
|-----|--------------------------------------------------------------|
| 1 | yahoo_answers_title_question.jsonl |
| 2 | squad_pairs.jsonl |
| 3 | eli5_question_answer.jsonl |
| 4 | WikiAnswers_pairs.jsonl |
| 5 | stackexchange_duplicate_questions_title_title.jsonl |
| 6 | TriviaQA_pairs.jsonl |
| 7 | stackexchange_duplicate_questions.jsonl |
| 8 | sentence-compression.jsonl |
| 9 | AllNLI_2cols.jsonl |
| 10 | NQ-train_pairs.jsonl |
| 11 | searchQA_question_top5_snippets_merged.jsonl |
| 12 | stackexchange_duplicate_questions_title-body_title-body.jsonl|
| 13 | SimpleWiki.jsonl |
| 14 | yahoo_answers_question_answer.jsonl |
| 15 | gooaq_pairs.jsonl |
| 16 | quora_duplicates.jsonl |
| 17 | stackexchange_duplicate_questions_body_body.jsonl |
| 18 | yahoo_answers_title_answer.jsonl |
| 19 | S2ORC_citation_pairs.jsonl |
| 20 | stackexchange_title_body_small.jsonl |
| 21 | fever_train.jsonl |
| 22 | altlex.jsonl |
| 23 | amazon-qa-train-pairs.jsonl |
| 24 | codesearchnet.jsonl |
| 25 | searchQA_question_topSnippet.jsonl |
### triplets
| No. | Filename |
|-----|--------------------------------------|
| 1 | AllNLI.jsonl |
| 2 | specter_train_triples.jsonl |
| 3 | quora_duplicates_triplets.jsonl |
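
As a quick usage illustration, here is a hedged sketch of loading this data with the `datasets` library; the configuration names (`default`, `triplets`, `triplets-expanded`, `pairs-100word`, `msmarco-triplets-flat`) come from this repository's metadata, and streaming is used for the pairs configuration only because it is large (~143M rows).

```python
from datasets import load_dataset

# Triplet configuration: text / positive / negative columns.
triplets = load_dataset("BEE-spoke-data/sbert-paraphrase-data", "triplets", split="train")
print(triplets[0])

# The "default" pairs configuration has columns "0" and "1"; stream it to avoid
# downloading the full ~143M-row split up front.
pairs = load_dataset(
    "BEE-spoke-data/sbert-paraphrase-data",
    "default",
    split="train",
    streaming=True,
)
print(next(iter(pairs)))
```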
| BEE-spoke-data/sbert-paraphrase-data | [
"task_categories:sentence-similarity",
"size_categories:100M<n<1B",
"language:en",
"license:odc-by",
"region:us"
] | 2024-01-22T02:59:19+00:00 | {"language": ["en"], "license": "odc-by", "size_categories": ["100M<n<1B"], "task_categories": ["sentence-similarity"], "dataset_info": [{"config_name": "default", "features": [{"name": "0", "dtype": "string"}, {"name": "1", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23655222164, "num_examples": 142947230}], "download_size": 15494823340, "dataset_size": 23655222164}, {"config_name": "msmarco-triplets-flat", "features": [{"name": "text", "dtype": "string"}, {"name": "positive", "dtype": "string"}, {"name": "negative", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 358771844, "num_examples": 485469}], "download_size": 233344152, "dataset_size": 358771844}, {"config_name": "pairs-100word", "features": [{"name": "0", "dtype": "string"}, {"name": "1", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2317278084, "num_examples": 1611483}], "download_size": 1332475321, "dataset_size": 2317278084}, {"config_name": "triplets", "features": [{"name": "text", "dtype": "string"}, {"name": "positive", "dtype": "string"}, {"name": "negative", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 222068225, "num_examples": 1064993}], "download_size": 106956648, "dataset_size": 222068225}, {"config_name": "triplets-expanded", "features": [{"name": "text", "dtype": "string"}, {"name": "positive", "dtype": "string"}, {"name": "negative", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1028568107, "num_examples": 1660962}], "download_size": 693685496, "dataset_size": 1028568107}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}, {"config_name": "msmarco-triplets-flat", "data_files": [{"split": "train", "path": "msmarco-triplets-flat/train-*"}]}, {"config_name": "pairs-100word", "data_files": [{"split": "train", "path": "pairs-100word/train-*"}]}, {"config_name": "triplets", "data_files": [{"split": "train", "path": "triplets/train-*"}]}, {"config_name": "triplets-expanded", "data_files": [{"split": "train", "path": "triplets-expanded/train-*"}]}]} | 2024-01-30T08:16:26+00:00 | [] | [
"en"
] | TAGS
#task_categories-sentence-similarity #size_categories-100M<n<1B #language-English #license-odc-by #region-us
| BEE-spoke-data/sbert-paraphrase-data
====================================
Paraphrase data from sentence-transformers
contents
--------
### default
### triplets
| [
"### default",
"### triplets"
] | [
"TAGS\n#task_categories-sentence-similarity #size_categories-100M<n<1B #language-English #license-odc-by #region-us \n",
"### default",
"### triplets"
] |
2ad986acc1ec62fb4a94171acc43f4fdd5bfde53 |
# Dataset Description
We are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.
**Note**: We use evidence as a general term to refer to a symptom or an antecedent.
This directory contains the following files:
- **release_evidences.json**: a JSON file describing all possible evidences considered in the dataset.
- **release_conditions.json**: a JSON file describing all pathologies considered in the dataset.
- **release_train_patients.zip**: a CSV file containing the patients of the training set.
- **release_validate_patients.zip**: a CSV file containing the patients of the validation set.
- **release_test_patients.zip**: a CSV file containing the patients of the test set.
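
A minimal sketch of reading these files is shown below. It only assumes that each zip archive contains a single CSV file (which pandas can read directly) and that the JSON files sit alongside it; nothing else about the file contents is assumed beyond what is described in the following sections.

```python
import json
import pandas as pd

# Descriptions of all evidences and pathologies.
with open("release_evidences.json", encoding="utf-8") as f:
    evidences = json.load(f)
with open("release_conditions.json", encoding="utf-8") as f:
    conditions = json.load(f)

# pandas reads a CSV straight out of a zip archive that holds a single file.
train = pd.read_csv("release_train_patients.zip")
print(train.columns.tolist())
print(f"{len(train)} training patients, {len(conditions)} pathologies described")
```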
## Evidence Description
Each evidence in the `release_evidences.json` file is described using the following entries:
- **name**: name of the evidence.
- **code_question**: a code allowing to identify which evidences are related. Evidences having the same `code_question` form a group of related symptoms. The value of the `code_question` refers to the evidence that needs to be simulated/activated for the other members of the group to be eventually simulated.
- **question_fr**: the query, in French, associated to the evidence.
- **question_en**: the query, in English, associated to the evidence.
- **is_antecedent**: a flag indicating whether the evidence is an antecedent or a symptom.
- **data_type**: the type of evidence. We use `B` for binary, `C` for categorical, and `M` for multi-choice evidences.
- **default_value**: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.
- **possible-values**: the possible values for the evidences. Only valid for categorical and multi-choice evidences.
- **value_meaning**: The meaning, in French and English, of each code that is part of the `possible-values` field. Only valid for categorical and multi-choice evidences.
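
To make these fields concrete, here is a hedged sketch that builds a few simple lookups from `release_evidences.json`. It assumes the file holds the entries described above either as a name-keyed mapping or as a plain list (both cases are handled); the field names used are exactly the ones listed in this section.

```python
import json

with open("release_evidences.json", encoding="utf-8") as f:
    raw = json.load(f)

# Accept either a {name: entry} mapping or a plain list of entries.
entries = list(raw.values()) if isinstance(raw, dict) else list(raw)

question_en = {e["name"]: e["question_en"] for e in entries}     # English query per evidence
antecedents = {e["name"] for e in entries if e["is_antecedent"]}  # antecedents vs. symptoms
multi_choice = {e["name"]: e["possible-values"]                   # values of multi-choice evidences
                for e in entries if e["data_type"] == "M"}

print(f"{len(entries)} evidences, {len(antecedents)} antecedents, {len(multi_choice)} multi-choice")
```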
## Pathology Description
The file `release_conditions.json` contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:
- **condition_name**: name of the pathology.
- **cond-name-fr**: name of the pathology in French.
- **cond-name-eng**: name of the pathology in English.
- **icd10-id**: ICD-10 code of the pathology.
- **severity**: the severity associated with the pathology. The lower the more severe.
- **symptoms**: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding `name` entry in the `release_evidences.json` file.
- **antecedents**: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding `name` entry in the `release_evidences.json` file.
## Patient Description
Each patient in each of the 3 sets has the following attributes:
- **AGE**: the age of the synthesized patient.
- **SEX**: the sex of the synthesized patient.
- **PATHOLOGY**: name of the ground truth pathology (`condition_name` property in the `release_conditions.json` file) that the synthesized patient is suffering from.
- **EVIDENCES**: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format `[evidence-name]_@_[evidence-value]` where [`evidence-name`] is the name of the evidence (`name` entry in the `release_evidences.json` file) and [`evidence-value`] is a value from the `possible-values` entry. Note that for a multi-choice evidence, it is possible to have several `[evidence-name]_@_[evidence-value]` items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as `[evidence-name]`.
- **INITIAL_EVIDENCE**: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., `EVIDENCES`) and it is part of this list.
- **DIFFERENTIAL_DIAGNOSIS**: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form `[[patho_1, proba_1], [patho_2, proba_2], ...]` where `patho_i` is the pathology name (`condition_name` entry in the `release_conditions.json` file) and `proba_i` is its related probability.
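
To illustrate the encoding above, here is a hedged parsing sketch. It assumes that, in the CSV files, the `EVIDENCES` and `DIFFERENTIAL_DIAGNOSIS` columns are stored as stringified Python lists (hence `ast.literal_eval`) and that binary evidences never contain the `_@_` separator; adjust the deserialization step if the columns are stored differently.

```python
import ast

def parse_evidence(token: str):
    """Split '[evidence-name]_@_[evidence-value]'; binary evidences carry no value."""
    name, sep, value = token.partition("_@_")
    return (name, value) if sep else (name, None)

def parse_patient(row):
    # Assumption: list-valued columns are serialized as Python literals in the CSV.
    evidences = [parse_evidence(t) for t in ast.literal_eval(row["EVIDENCES"])]
    differential = ast.literal_eval(row["DIFFERENTIAL_DIAGNOSIS"])  # [[pathology, probability], ...]
    return {
        "age": row["AGE"],
        "sex": row["SEX"],
        "pathology": row["PATHOLOGY"],
        "initial_evidence": row["INITIAL_EVIDENCE"],
        "evidences": evidences,
        "differential": differential,
    }
```

For example, `parse_patient(train.iloc[0])` would decode the first patient of the training CSV loaded in the earlier sketch.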
## Note:
We hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.
It is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.
In the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.
For more information, please check our [paper](https://arxiv.org/abs/2205.09148). | aai530-group6/ddxplus | [
"task_categories:tabular-classification",
"task_ids:multi-class-classification",
"size_categories:1K<n<10K",
"source_datasets:original",
"language:en",
"license:cc-by-4.0",
"automatic-diagnosis",
"automatic-symptom-detection",
"differential-diagnosis",
"synthetic-patients",
"diseases",
"health-care",
"arxiv:2205.09148",
"region:us"
] | 2024-01-22T03:37:14+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["tabular-classification"], "task_ids": ["multi-class-classification"], "paperswithcode_id": "ddxplus", "pretty_name": "DDXPlus", "license_link": "https://creativecommons.org/licenses/by/4.0/", "tags": ["automatic-diagnosis", "automatic-symptom-detection", "differential-diagnosis", "synthetic-patients", "diseases", "health-care"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.csv"}, {"split": "test", "path": "test.csv"}, {"split": "validate", "path": "validate.csv"}]}], "extra_gated_prompt": "By accessing this dataset, you agree to use it solely for research purposes and not for clinical decision-making.", "extra_gated_fields": {"Consent": "checkbox", "Purpose of use": {"type": "select", "options": ["Research", "Educational", {"label": "Other", "value": "other"}]}}, "train-eval-index": [{"config": "default", "task": "medical-diagnosis", "task_id": "binary-classification", "splits": {"train_split": "train", "eval_split": "validate"}, "col_mapping": {"AGE": "AGE", "SEX": "SEX", "PATHOLOGY": "PATHOLOGY", "EVIDENCES": "EVIDENCES", "INITIAL_EVIDENCE": "INITIAL_EVIDENCE", "DIFFERENTIAL_DIAGNOSIS": "DIFFERENTIAL_DIAGNOSIS"}, "metrics": [{"type": "accuracy", "name": "Accuracy"}, {"type": "f1", "name": "F1 Score"}]}]} | 2024-01-22T03:48:18+00:00 | [
"2205.09148"
] | [
"en"
] | TAGS
#task_categories-tabular-classification #task_ids-multi-class-classification #size_categories-1K<n<10K #source_datasets-original #language-English #license-cc-by-4.0 #automatic-diagnosis #automatic-symptom-detection #differential-diagnosis #synthetic-patients #diseases #health-care #arxiv-2205.09148 #region-us
|
# Dataset Description
We are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.
Note: We use evidence as a general term to refer to a symptom or an antecedent.
This directory contains the following files:
- release_evidences.json: a JSON file describing all possible evidences considered in the dataset.
- release_conditions.json: a JSON file describing all pathologies considered in the dataset.
- release_train_patients.zip: a CSV file containing the patients of the training set.
- release_validate_patients.zip: a CSV file containing the patients of the validation set.
- release_test_patients.zip: a CSV file containing the patients of the test set.
## Evidence Description
Each evidence in the 'release_evidences.json' file is described using the following entries:
- name: name of the evidence.
- code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.
- question_fr: the query, in French, associated to the evidence.
- question_en: the query, in English, associated to the evidence.
- is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.
- data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.
- default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.
- possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.
- value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.
## Pathology Description
The file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:
- condition_name: name of the pathology.
- cond-name-fr: name of the pathology in French.
- cond-name-eng: name of the pathology in English.
- icd10-id: ICD-10 code of the pathology.
- severity: the severity associated with the pathology. The lower the more severe.
- symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.
- antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.
## Patient Description
Each patient in each of the 3 sets has the following attributes:
- AGE: the age of the synthesized patient.
- SEX: the sex of the synthesized patient.
- PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.
- EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.
- INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.
- DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.
## Note:
We hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.
It is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.
In the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.
For more information, please check our paper. | [
"# Dataset Description\n\nWe are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.\n\nNote: We use evidence as a general term to refer to a symptom or an antecedent.\n\nThis directory contains the following files:\n - release_evidences.json: a JSON file describing all possible evidences considered in the dataset.\n - release_conditions.json: a JSON file describing all pathologies considered in the dataset.\n - release_train_patients.zip: a CSV file containing the patients of the training set.\n - release_validate_patients.zip: a CSV file containing the patients of the validation set.\n - release_test_patients.zip: a CSV file containing the patients of the test set.",
"## Evidence Description\n\nEach evidence in the 'release_evidences.json' file is described using the following entries:\n - name: name of the evidence.\n - code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.\n - question_fr: the query, in French, associated to the evidence.\n - question_en: the query, in English, associated to the evidence.\n - is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.\n - data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.\n - default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.\n - possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.\n - value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.",
"## Pathology Description\nThe file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:\n - condition_name: name of the pathology.\n - cond-name-fr: name of the pathology in French.\n - cond-name-eng: name of the pathology in English.\n - icd10-id: ICD-10 code of the pathology.\n - severity: the severity associated with the pathology. The lower the more severe.\n - symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.\n - antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.",
"## Patient Description\n\nEach patient in each of the 3 sets has the following attributes:\n - AGE: the age of the synthesized patient.\n - SEX: the sex of the synthesized patient.\n - PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.\n - EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.\n - INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.\n - DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.",
"## Note:\n\nWe hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.\n\nIt is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.\n\nIn the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.\n\nFor more information, please check our paper."
] | [
"TAGS\n#task_categories-tabular-classification #task_ids-multi-class-classification #size_categories-1K<n<10K #source_datasets-original #language-English #license-cc-by-4.0 #automatic-diagnosis #automatic-symptom-detection #differential-diagnosis #synthetic-patients #diseases #health-care #arxiv-2205.09148 #region-us \n",
"# Dataset Description\n\nWe are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.\n\nNote: We use evidence as a general term to refer to a symptom or an antecedent.\n\nThis directory contains the following files:\n - release_evidences.json: a JSON file describing all possible evidences considered in the dataset.\n - release_conditions.json: a JSON file describing all pathologies considered in the dataset.\n - release_train_patients.zip: a CSV file containing the patients of the training set.\n - release_validate_patients.zip: a CSV file containing the patients of the validation set.\n - release_test_patients.zip: a CSV file containing the patients of the test set.",
"## Evidence Description\n\nEach evidence in the 'release_evidences.json' file is described using the following entries:\n - name: name of the evidence.\n - code_question: a code allowing to identify which evidences are related. Evidences having the same 'code_question' form a group of related symptoms. The value of the 'code_question' refers to the evidence that need to be simulated/activated for the other members of the group to be eventually simulated.\n - question_fr: the query, in French, associated to the evidence.\n - question_en: the query, in English, associated to the evidence.\n - is_antecedent: a flag indicating whether the evidence is an antecedent or a symptom.\n - data_type: the type of evidence. We use 'B' for binary, 'C' for categorical, and 'M' for multi-choice evidences.\n - default_value: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.\n - possible-values: the possible values for the evidences. Only valid for categorical and multi-choice evidences.\n - value_meaning: The meaning, in French and English, of each code that is part of the 'possible-values' field. Only valid for categorical and multi-choice evidences.",
"## Pathology Description\nThe file 'release_conditions.json' contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:\n - condition_name: name of the pathology.\n - cond-name-fr: name of the pathology in French.\n - cond-name-eng: name of the pathology in English.\n - icd10-id: ICD-10 code of the pathology.\n - severity: the severity associated with the pathology. The lower the more severe.\n - symptoms: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding 'name' entry in the 'release_evidences.json' file.\n - antecedents: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding 'name' entry in the 'release_evidences.json' file.",
"## Patient Description\n\nEach patient in each of the 3 sets has the following attributes:\n - AGE: the age of the synthesized patient.\n - SEX: the sex of the synthesized patient.\n - PATHOLOGY: name of the ground truth pathology ('condition_name' property in the 'release_conditions.json' file) that the synthesized patient is suffering from.\n - EVIDENCES: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format '[evidence-name]_@_[evidence-value]' where ['evidence-name'] is the name of the evidence ('name' entry in the 'release_evidences.json' file) and ['evidence-value'] is a value from the 'possible-values' entry. Note that for a multi-choice evidence, it is possible to have several '[evidence-name]_@_[evidence-value]' items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as '[evidence-name]'.\n - INITIAL_EVIDENCE: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., 'EVIDENCES') and it is part of this list.\n - DIFFERENTIAL_DIAGNOSIS: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form '[[patho_1, proba_1], [patho_2, proba_2], ...]' where 'patho_i' is the pathology name ('condition_name' entry in the 'release_conditions.json' file) and 'proba_i' is its related probability.",
"## Note:\n\nWe hope this dataset will encourage future works for ASD and AD systems that consider the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is formed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model performance and verifying that the system has proper coverage and representation of the population that it will interact with.\n\nIt is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.\n\nIn the creation of the DDXPlus dataset, a small subset of the diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means less potential evidences to collect. It is thus essential to understand this point when we look at the differential produced and the evidence collected by a model based on this dataset.\n\nFor more information, please check our paper."
] |
5f1f2021485bb9d397d72af7a0165e0a4223b923 | Optical flows associated with our work "We're Not Using Videos Effectively: An Updated Domain Adaptive Video Segmentation Baseline"
See the [github](https://github.com/SimarKareer/UnifiedVideoDA) for full instructions, but to install run
```bash
git lfs install
git clone https://huggingface.co/datasets/hoffman-lab/Unified-VideoDA-Generated-Flows
``` | hoffman-lab/Unified-VideoDA-Generated-Flows | [
"region:us"
] | 2024-01-22T03:50:49+00:00 | {} | 2024-01-28T01:02:47+00:00 | [] | [] | TAGS
#region-us
| Optical flows associated with our work "We're Not Using Videos Effectively: An Updated Domain Adaptive Video Segmentation Baseline"
See the github for full instructions, but to install run
| [] | [
"TAGS\n#region-us \n"
] |
196f18dd036583ab40e4f7d74696f495f6a178f8 |
# Dataset of Jeanne d'Arc (Dark) (Granblue Fantasy)
This is the dataset of Jeanne d'Arc (Dark) (Granblue Fantasy), containing 74 images and their tags.
The core tags of this character are `long_hair, hair_ornament, breasts, white_hair, red_eyes, large_breasts, bangs, hair_flower, medium_breasts, very_long_hair, wings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 96.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 62.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 171 | 123.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 88.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 171 | 163.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dark_jeanne_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, smile, solo, bare_shoulders, collarbone, navel, black_bikini, blush, feather_hair_ornament, flower, hair_between_eyes, official_alternate_costume, see-through, simple_background, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, smile, solo, looking_at_viewer, simple_background, armor, black_gloves, collarbone, dress, feather_hair_ornament, white_background |
| 2 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, armor, feathers, holding_sword, cleavage, bare_shoulders, collarbone, black_gloves, boots, ahoge, open_mouth, single_glove, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | smile | solo | bare_shoulders | collarbone | navel | black_bikini | blush | feather_hair_ornament | flower | hair_between_eyes | official_alternate_costume | see-through | simple_background | white_background | armor | black_gloves | dress | feathers | holding_sword | boots | ahoge | open_mouth | single_glove | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:--------|:-------|:-----------------|:-------------|:--------|:---------------|:--------|:------------------------|:---------|:--------------------|:-----------------------------|:--------------|:--------------------|:-------------------|:--------|:---------------|:--------|:-----------|:----------------|:--------|:--------|:-------------|:---------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | | | | | X | X | X | X | X | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | X | X | | X | X | X | X | X | X | X |
| CyberHarem/dark_jeanne_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-22T03:57:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-22T04:11:03+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Jeanne d'Arc (Dark) (Granblue Fantasy)
=================================================
This is the dataset of Jeanne d'Arc (Dark) (Granblue Fantasy), containing 74 images and their tags.
The core tags of this character are 'long\_hair, hair\_ornament, breasts, white\_hair, red\_eyes, large\_breasts, bangs, hair\_flower, medium\_breasts, very\_long\_hair, wings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
158d0a690d51f9a8e6f61d55dd30105c67eefb8e |
Two types of questions are in this dataset.
* Synthetically generated questions and answers from ChatGPT 3.5 Turbo on US History.
	* This used a Python script that first randomly selects a date between 1607 and 2020, then generates a historical question
	from that date. This was repeated 3000 times.
* This is combined with curated, 2048-character chunks of The American Yawp and OpenStax American History,
which were fed into ChatGPT 3.5 Turbo one chunk at a time to generate the context, then a question from
that context, then an answer to that question.
Python script used to generate synthetic history questions and answers from ChatGPT Turbo-Instruct.
```
import pandas as pd
import random
import openai
from datetime import datetime
# Set your OpenAI API key
openai.api_key = "API KEY"
# Function to generate synthetic historical questions using OpenAI GPT-3
def generate_question():
# Define a prompt to get a historical question from GPT-3
prompt = "Generate a historical question related to American History from 1607-2020."
# Call the OpenAI API to get a synthetic question
response = openai.Completion.create(
engine="gpt-3.5-turbo-instruct",
prompt=prompt,
max_tokens=50,
n=1,
stop=None
)
# Extract the generated question from the API response
question = response['choices'][0]['text'].strip()
# Generate input and output based on the selected question
context = f"Make sure the answer does not include any form of list, and that it completes the prompt with a complete sentence."
input_text = "" # Set input_text to an empty string
# Instead of using the question as the answer, let's generate a concise response
response_prompt = f"{context} Answer the following question: {question} "
answer_response = openai.Completion.create(
engine="gpt-3.5-turbo-instruct",
prompt=response_prompt,
max_tokens=50,
n=1,
stop=None
)
# Check the token count and adjust the prompt if needed
remaining_tokens = 100 - answer_response['usage']['total_tokens']
if remaining_tokens < 10:
# If there are fewer than 10 tokens left, reduce the max tokens in the response
answer_response = openai.Completion.create(
engine="gpt-3.5-turbo-instruct",
prompt=response_prompt,
max_tokens=answer_response['usage']['total_tokens'] + 10,
n=1,
stop=None
)
output_text = f"{answer_response['choices'][0]['text'].strip()}"
return question, input_text, output_text
# Create a DataFrame to store the synthetic dataset
data = {'instruction': [], 'input': [], 'output': []}
# Generate 100 entries
for _ in range(100):
question, input_text, output_text = generate_question()
data['instruction'].append(question)
data['input'].append(input_text)
data['output'].append(output_text)
# Convert the dictionary to a DataFrame
df = pd.DataFrame(data)
# Generate a timestamp for the unique filename
timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
# Save the DataFrame to a CSV file with a unique filename
file_path = f'/content/data_sets/data_set_{timestamp}.csv'
df.to_csv(file_path, index=False)
print(f"Synthetic dataset generated and saved to '{file_path}'")
```
This is the Python script to segment books or text files into chunks, then generate questions and answers into a CSV. This does produce some
misalignments between cells, so make sure to look over the data for any issues once it completes. This was generated through Colab, so !pip
instructions are present.
## Generating Chunks
The chunk size can be adjusted depending on whether you want to target an LLM with a larger token window. I tried to keep mine within the
token limit of ChatGPT 3.5 Turbo 1106 for the larger context window.
```
# Commented out IPython magic to ensure Python compatibility.
!pip install openai==0.28.0
!pip install pandas
# %mkdir textfiles
# %mkdir chunks
import os
input_file = "/content/textfiles/text.txt"
chunk_size = 8192
output_dir = "chunks"
if not os.path.exists(output_dir):
os.makedirs(output_dir)
with open(input_file) as f:
text = f.read()
print(f"Original text length: {len(text)}")
chunks = []
for i in range(0, len(text), chunk_size):
chunks.append(text[i:i+chunk_size])
last_chunk = text[i+chunk_size:]
if last_chunk:
chunks.append(last_chunk)
total_chunks = "".join(chunks)
assert total_chunks == text, "Text does not match chunks"
print(f"Total chunks length: {len(total_chunks)}")
print(f"Num chunks: {len(chunks)}")
for i, chunk in enumerate(chunks):
chunk_file = os.path.join(output_dir, f"chunk{i}.txt")
with open(chunk_file, "w") as f:
f.write(chunk)
print(f"Saved {len(chunks)} chunks to {output_dir}")
```
## Generating text (questions and answers) and formatting
Using ChatGPT to determine the context of each chunk, then having it create a historical question from that text, and then having it create
the answer. (At times it wanted to put the answers into a summarized list, so I added to the instruction NOT to include the answer in a list. This
has seemed to work a little better.)
The last part is to bring it all into a CSV file from the format generated by the code above. Make sure that you have made a backup of the
chunks generated by the first Python script if you wish to keep them. The script also deletes each chunk once its questions and
answers have been created, which helped me figure out where I needed to stop and start next time.
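As a brief aside (this snippet is not part of the original card), once the script below has written its per-chunk CSVs, they could be combined into a single file with pandas, for example:
```
import os
import pandas as pd

# Illustrative only: combine the per-chunk CSVs written by the script below into one file.
qa_dir = "/content/qa_chunks"
frames = [
    pd.read_csv(os.path.join(qa_dir, name))
    for name in sorted(os.listdir(qa_dir))
    if name.endswith(".csv")
]
combined = pd.concat(frames, ignore_index=True)
combined.to_csv("/content/qa_combined.csv", index=False)
```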
```
import os
import openai
import pandas as pd
# Set your OpenAI API key
openai.api_key = 'API KEY'
chunk_dir = "/content/chunks"
qa_dir = "/content/qa_chunks"
# Ensure the output directory exists
if not os.path.exists(qa_dir):
os.makedirs(qa_dir)
# Function to get historical context and Q&A
def process_chunk(chunk):
# Get historical context
context_response = openai.ChatCompletion.create(
model="gpt-3.5-turbo-1106",
messages=[{"role": "system", "content": "Determine the historical context of the following text."},
{"role": "user", "content": chunk}]
)
context = context_response['choices'][0]['message']['content']
# Generate Q&A
questions = []
answers = []
for i in range(5):
question_response = openai.ChatCompletion.create(
model="gpt-3.5-turbo-1106",
messages=[{"role": "system", "content": "Generate a different question about the historical context of the text."},
{"role": "user", "content": chunk}]
)
question = question_response['choices'][0]['message']['content']
questions.append(question)
answer_response = openai.ChatCompletion.create(
model="gpt-3.5-turbo-1106",
messages=[{"role": "system", "content": f"Answer this question based on the text, do not put the answer into a list: {question}"},
{"role": "user", "content": chunk}]
)
answer = answer_response['choices'][0]['message']['content']
answers.append(answer)
return questions, answers
# Function to create a CSV file
def create_csv(questions, answers, filename):
df = pd.DataFrame({'Question': questions, 'Answer': answers})
df.to_csv(filename, index=False)
# Read and process each chunk with a pause after every 50 chunks
chunk_counter = 0
# Retrieve and sort chunk file names
chunk_files = sorted([f for f in os.listdir(chunk_dir) if f.startswith("chunk") and f.endswith(".txt")], key=lambda x: int(x[5:-4]))
for chunk_file in chunk_files:
chunk_path = os.path.join(chunk_dir, chunk_file)
with open(chunk_path, 'r') as f:
chunk = f.read()
questions, answers = process_chunk(chunk)
csv_file = os.path.join(qa_dir, f"qa_{chunk_file[:-4]}.csv")
create_csv(questions, answers, csv_file)
print(f"Processed {chunk_file}, questions and answers saved to {csv_file}")
# Delete the chunk file after processing
os.remove(chunk_path)
print(f"Deleted processed chunk file: {chunk_file}")
chunk_counter += 1
if chunk_counter % 50 == 0:
input("Processed 50 chunks. Press Enter to continue...")
print("Processing complete.")
``` | ambrosfitz/mighty-history-merge | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-22T04:27:12+00:00 | {"license": "cc-by-4.0"} | 2024-01-25T23:43:43+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
|
Two types of questions are in this dataset.
* Synthetically generated questions and answers from ChatGPT 3.5 Turbo on US History.
	* This used a Python script that first randomly selects a date between 1607 and 2020, then generates a historical question
	from that date. This was repeated 3000 times.
* This is combined with curated, 2048-character chunks of The American Yawp and OpenStax American History,
which were fed into ChatGPT 3.5 Turbo one chunk at a time to generate the context, then a question from
that context, then an answer to that question.
Python script used to generate synthetic history questions and answers from ChatGPT Turbo-Instruct.
This is the Python script to segment books or text files into chunks, then generate questions and answers into a CSV. This does produce some
misalignments between cells, so make sure to look over the data for any issues once it completes. This was generated through Colab, so !pip
instructions are present.
## Generating Chunks
The chunk size can be adjusted depending on whether you want to target an LLM with a larger token window. I tried to keep mine within the
token limit of ChatGPT 3.5 Turbo 1106 for the larger context window.
## Generating text (questions and answers) and formatting
Using ChatGPT to determine the context of each chunk, then having it create a historical question from that text, and then having it create
the answer. (At times it wanted to put the answers into a summarized list, so I added to the instruction NOT to include the answer in a list. This
has seemed to work a little better.)
The last part is to bring it all into a CSV file from the format generated by the code above. Make sure that you have made a backup of the
chunks generated by the first Python script if you wish to keep them. The script also deletes each chunk once its questions and
answers have been created, which helped me figure out where I needed to stop and start next time.
| [
"## Generating Chunks\n\nChunking the sizes helps depending if you want to modify the llm to something with a larger token window. I tried to keep mine within the\ntoken limit of ChatGPT 3.5 Turbo 1106 for the larger context window.",
"## Generating text (questions and answers) and formating\n\nUsing the chatgpt to generate the context of the chunk, then having it create a historical question from that text. Then having it create\nthe answer. (At time it wanted to put the answers into a summarized list, so I add to the instruction NOT to include it in a list. This \nhas seemed to work a little better.)\n\nThe last part is to bring it all into a csv file from the format generated from the code above. Make sure that you have made backup to the\nchunks gereated by the first python script. If you wish to keep them. The last part of this code deletes the last chunk that it created\nan answer and question for. This helped me figure out where I need to stop and start it next time."
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"## Generating Chunks\n\nChunking the sizes helps depending if you want to modify the llm to something with a larger token window. I tried to keep mine within the\ntoken limit of ChatGPT 3.5 Turbo 1106 for the larger context window.",
"## Generating text (questions and answers) and formating\n\nUsing the chatgpt to generate the context of the chunk, then having it create a historical question from that text. Then having it create\nthe answer. (At time it wanted to put the answers into a summarized list, so I add to the instruction NOT to include it in a list. This \nhas seemed to work a little better.)\n\nThe last part is to bring it all into a csv file from the format generated from the code above. Make sure that you have made backup to the\nchunks gereated by the first python script. If you wish to keep them. The last part of this code deletes the last chunk that it created\nan answer and question for. This helped me figure out where I need to stop and start it next time."
] |
c1ada1560eea70c9fa352792222cf28418f54b04 |
...
dataset_info:
features:
- name: id
dtype: string
- name: Label
dtype: string | aidystark/FOOT40K | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-22T04:28:26+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "pretty_name": "FOOT40k"} | 2024-01-22T11:27:10+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
...
dataset_info:
features:
- name: id
dtype: string
- name: Label
dtype: string | [] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n"
] |
7e625c4f95bd52d7147d6fc97d7829fab59ec976 | # Dataset Card for "processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | loganengstrom/dsdm-candidate-c4 | [
"region:us"
] | 2024-01-22T04:38:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "uint16"}], "splits": [{"name": "train", "num_bytes": 445178826792, "num_examples": 216948746}], "download_size": 0, "dataset_size": 445178826792}} | 2024-01-23T10:42:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed"
More Information needed | [
"# Dataset Card for \"processed\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed\"\n\nMore Information needed"
] |
53dbb883a0a484a96c3d0e6d40b881c8f59c4c7b |
This dataset was generated by reformatting [`coref-data/arrau_raw`](https://huggingface.co/datasets/coref-data/arrau_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
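As a quick orientation, the snippet below is a minimal sketch (not from the original card) of loading the converted data with the `datasets` library and reading the coreference chains. The field names follow the dataset's declared features; the mention layout is assumed here to be `[sentence_index, start_token, end_token]` with an inclusive end.

```python
from datasets import load_dataset

# Minimal sketch: load one split of the converted ARRAU data.
data = load_dataset("coref-data/arrau_indiscrim", split="validation")

doc = data[0]
print(doc["id"], doc["genre"])

# Each entry of `coref_chains` is one coreference chain; each mention is assumed to be
# [sentence_index, start_token, end_token] (inclusive end) over the sentence's tokens.
for chain in doc["coref_chains"][:3]:
    mentions = []
    for sent_idx, start, end in chain:
        tokens = doc["sentences"][sent_idx]["tokens"]
        mentions.append(" ".join(tok["text"] for tok in tokens[start : end + 1]))
    print(mentions)
```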
| coref-data/arrau_indiscrim | [
"region:us"
] | 2024-01-22T04:44:39+00:00 | {"dataset_info": {"features": [{"name": "split", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "end_char", "dtype": "int64"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "start_char", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 34956353, "num_examples": 444}, {"name": "validation", "num_bytes": 3984498, "num_examples": 33}, {"name": "test", "num_bytes": 5549898, "num_examples": 75}], "download_size": 9374318, "dataset_size": 44490749}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-13T04:15:47+00:00 | [] | [] | TAGS
#region-us
|
This dataset was generated by reformatting 'coref-data/arrau_raw' into the indiscrim coreference format. See that repo for dataset details.
See ianporada/coref-data for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
| [] | [
"TAGS\n#region-us \n"
] |
ab2187f93350722902a1364341aeaf1e2a02fc3d |
# Dataset Card for Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24](https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24",
"harness_winogrande_5",
split="train")
```
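Since each evaluated task is stored as its own configuration, you can enumerate the configurations before loading one. This is a small illustrative sketch (not part of the generated card) using the standard `datasets` helper:

```python
from datasets import get_dataset_config_names

# List the per-task configurations available in this results repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24"
)
print(len(configs), "configurations")
print(configs[:5])
```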
## Latest results
These are the [latest results from run 2024-01-22T04:44:53.381027](https://huggingface.co/datasets/open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24/blob/main/results_2024-01-22T04-44-53.381027.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5189240030169358,
"acc_stderr": 0.03417423514779615,
"acc_norm": 0.5233187157188728,
"acc_norm_stderr": 0.03491752755385364,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.530042963383804,
"mc2_stderr": 0.014928626205495087
},
"harness|arc:challenge|25": {
"acc": 0.5307167235494881,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.5563139931740614,
"acc_norm_stderr": 0.014518421825670452
},
"harness|hellaswag|10": {
"acc": 0.6161123282214698,
"acc_stderr": 0.004853371646239246,
"acc_norm": 0.813483369846644,
"acc_norm_stderr": 0.0038872693686016107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.689908256880734,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.689908256880734,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.03874102859818082,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.03874102859818082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.016328814422102052,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.016328814422102052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.02618966696627204,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.02618966696627204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.028472938478033526,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.028472938478033526
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460643,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.530042963383804,
"mc2_stderr": 0.014928626205495087
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483667
},
"harness|gsm8k|5": {
"acc": 0.23199393479909022,
"acc_stderr": 0.01162687317509241
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24 | [
"region:us"
] | 2024-01-22T04:46:42+00:00 | {"pretty_name": "Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24", "dataset_summary": "Dataset automatically created during the evaluation run of model [inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24](https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T04:44:53.381027](https://huggingface.co/datasets/open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24/blob/main/results_2024-01-22T04-44-53.381027.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5189240030169358,\n \"acc_stderr\": 0.03417423514779615,\n \"acc_norm\": 0.5233187157188728,\n \"acc_norm_stderr\": 0.03491752755385364,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.530042963383804,\n \"mc2_stderr\": 0.014928626205495087\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304038,\n \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6161123282214698,\n \"acc_stderr\": 0.004853371646239246,\n \"acc_norm\": 0.813483369846644,\n \"acc_norm_stderr\": 0.0038872693686016107\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 
0.5347222222222222,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.0314895582974553,\n \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.0314895582974553\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6515151515151515,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 
0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.689908256880734,\n \"acc_stderr\": 0.019830849684439756,\n \"acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439756\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.03296245110172229,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.03296245110172229\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.03874102859818082,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.03874102859818082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.016328814422102052,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.016328814422102052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.02618966696627204,\n \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.02618966696627204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.028472938478033526,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.028472938478033526\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402605,\n \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402605\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n \"acc_stderr\": 0.012499840347460643,\n \"acc_norm\": 0.39765319426336376,\n \"acc_norm_stderr\": 0.012499840347460643\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969768,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969768\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.530042963383804,\n \"mc2_stderr\": 0.014928626205495087\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483667\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23199393479909022,\n 
\"acc_stderr\": 0.01162687317509241\n }\n}\n```", "repo_url": "https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|arc:challenge|25_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|gsm8k|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hellaswag|10_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T04_44_53.381027", "path": ["**/details_harness|winogrande|5_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T04-44-53.381027.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T04_44_53.381027", "path": ["results_2024-01-22T04-44-53.381027.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T04-44-53.381027.parquet"]}]}]} | 2024-01-22T04:47:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24
Dataset automatically created during the evaluation run of model inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-22T04:44:53.381027 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24\n\n\n\nDataset automatically created during the evaluation run of model inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T04:44:53.381027(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24\n\n\n\nDataset automatically created during the evaluation run of model inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T04:44:53.381027(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8376f494f02a85d47069d61c351c27d7bbe72451 |
# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32",
"harness_winogrande_5",
split="train")
```
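The snippet above pulls a single task configuration. As a minimal sketch of the same API (the `"results"` configuration and the `"latest"` split are the only names assumed here; both are described elsewhere in this card), you can also load the aggregated metrics of the run directly:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points at the most recent evaluation for this model.
results = load_dataset(
    "open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32",
    "results",
    split="latest",
)

# Inspect the first row to get a quick look at the stored metrics.
print(results[0])
```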
## Latest results
These are the [latest results from run 2024-01-22T04:57:39.972792](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32/blob/main/results_2024-01-22T04-57-39.972792.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.626011070759908,
"acc_stderr": 0.03257825702529057,
"acc_norm": 0.6346917759501802,
"acc_norm_stderr": 0.033311769083313715,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5118901360350582,
"mc2_stderr": 0.015028312827746176
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868809,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685247
},
"harness|hellaswag|10": {
"acc": 0.6520613423620792,
"acc_stderr": 0.004753429806645434,
"acc_norm": 0.8466440948018323,
"acc_norm_stderr": 0.0035959381241662137
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215293,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215293
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919426,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684803,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.02336387809663245,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.02336387809663245
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.04142313771996665,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.04142313771996665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998874,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998874
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156837,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156837
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.012763450734699812,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.012763450734699812
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882537,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5118901360350582,
"mc2_stderr": 0.015028312827746176
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247005
},
"harness|gsm8k|5": {
"acc": 0.1508718726307809,
"acc_stderr": 0.009859004137305687
}
}
```
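As a quick illustration of how these numbers relate to each other, the sketch below (an assumption-laden example, not part of the evaluation harness: it presumes the dictionary printed above has already been parsed into a Python dict named `results`, e.g. via `json.loads`) recovers the MMLU average from the per-subtask scores:

```python
# `results` is assumed to be the dictionary shown above, parsed from JSON.
mmlu_scores = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Unweighted mean over the MMLU (hendrycksTest) subtasks.
mmlu_average = sum(mmlu_scores.values()) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {mmlu_average:.4f}")

# The headline aggregates reported on the leaderboard are stored under "all".
print("overall acc_norm:", results["all"]["acc_norm"])
```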
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32 | [
"region:us"
] | 2024-01-22T05:00:03+00:00 | {"pretty_name": "Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32", "dataset_summary": "Dataset automatically created during the evaluation run of model [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T04:57:39.972792](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32/blob/main/results_2024-01-22T04-57-39.972792.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.626011070759908,\n \"acc_stderr\": 0.03257825702529057,\n \"acc_norm\": 0.6346917759501802,\n \"acc_norm_stderr\": 0.033311769083313715,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5118901360350582,\n \"mc2_stderr\": 0.015028312827746176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868809,\n \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n \"acc_stderr\": 0.004753429806645434,\n \"acc_norm\": 0.8466440948018323,\n \"acc_norm_stderr\": 0.0035959381241662137\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215293,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215293\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919426,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 
0.022935144053919426\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684803,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.02336387809663245,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.02336387809663245\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.04142313771996665,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.04142313771996665\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998874,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998874\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156837,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156837\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n \"acc_stderr\": 0.012763450734699812,\n \"acc_norm\": 0.48370273794002605,\n \"acc_norm_stderr\": 0.012763450734699812\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882537,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882537\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5118901360350582,\n \"mc2_stderr\": 0.015028312827746176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247005\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.1508718726307809,\n \"acc_stderr\": 0.009859004137305687\n }\n}\n```", "repo_url": "https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|arc:challenge|25_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|gsm8k|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hellaswag|10_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-57-39.972792.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-57-39.972792.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-57-39.972792.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T04-57-39.972792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-57-39.972792.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["**/details_harness|winogrande|5_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T04-57-39.972792.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T04_57_39.972792", "path": ["results_2024-01-22T04-57-39.972792.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T04-57-39.972792.parquet"]}]}]} | 2024-01-22T05:00:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32
Dataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
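```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.32",
    "harness_winogrande_5",
    split="train")
```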
## Latest results
These are the latest results from run 2024-01-22T04:57:39.972792 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32\n\n\n\nDataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T04:57:39.972792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32\n\n\n\nDataset automatically created during the evaluation run of model kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.32 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T04:57:39.972792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
98b6c399e15088041c7a4bb0efc861c762a77a05 |
This dataset was generated by reformatting [`coref-data/phrase_detectives_raw`](https://huggingface.co/datasets/coref-data/phrase_detectives_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
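As a quick, illustrative sketch (assuming the standard `datasets` loader and the split and feature names listed in this repo's configuration), you can load a split and inspect the coreference chains like this:

```python
from datasets import load_dataset

# Load one split of the converted corpus (train/validation/test are available).
ds = load_dataset("coref-data/phrase_detectives_indiscrim", split="validation")

doc = ds[0]
print(doc["id"], doc["genre"])
print("sentences:", len(doc["sentences"]))
print("coreference chains:", len(doc["coref_chains"]))

# Each chain groups the mentions of one entity; the mention encoding follows
# the indiscrim format described in the conversion repo linked above.
for chain in doc["coref_chains"][:3]:
    print(len(chain), "mentions")
```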
| coref-data/phrase_detectives_indiscrim | [
"region:us"
] | 2024-01-22T05:09:50+00:00 | {"dataset_info": {"features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 43394172.38513514, "num_examples": 695}, {"name": "validation", "num_bytes": 2809694.614864865, "num_examples": 45}, {"name": "test", "num_bytes": 847618, "num_examples": 45}], "download_size": 13119886, "dataset_size": 47051485.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-22T05:09:53+00:00 | [] | [] | TAGS
#region-us
|
This dataset was generated by reformatting 'coref-data/phrase_detectives_raw' into the indiscrim coreference format. See that repo for dataset details.
See ianporada/coref-data for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
| [] | [
"TAGS\n#region-us \n"
] |
1f1f6cfa167165944ee5f388e7c6ef412406c052 |
This dataset was generated by reformatting [`coref-data/korean_ecmt_raw`](https://huggingface.co/datasets/coref-data/korean_ecmt_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
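A minimal usage sketch, assuming the feature names shown in this repo's dataset_info (sentence-level `tokens` with `lemma` and `xpos` annotations, plus document-level `coref_chains`):

```python
from datasets import load_dataset

# Load the test split of the converted corpus.
ds = load_dataset("coref-data/korean_ecmt_indiscrim", split="test")

doc = ds[0]
sentence = doc["sentences"][0]
print(sentence["text"])

# Tokens carry lemma and xpos annotations in this conversion.
for token in sentence["tokens"][:5]:
    print(token["id"], token["text"], token["lemma"], token["xpos"])

print("coreference chains in this document:", len(doc["coref_chains"]))
```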
| coref-data/korean_ecmt_indiscrim | [
"region:us"
] | 2024-01-22T05:20:21+00:00 | {"dataset_info": {"features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 36047013, "num_examples": 1345}, {"name": "validation", "num_bytes": 3639179, "num_examples": 135}, {"name": "test", "num_bytes": 3703845, "num_examples": 207}], "download_size": 11763612, "dataset_size": 43390037}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-22T05:21:37+00:00 | [] | [] | TAGS
#region-us
|
This dataset was generated by reformatting 'coref-data/korean_ecmt_raw' into the indiscrim coreference format. See that repo for dataset details.
See ianporada/coref-data for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
| [] | [
"TAGS\n#region-us \n"
] |
118b64c21ee3e807b46c0ec3f23022f51d65b7c0 |
# Dataset Card for Evaluation run of LordNoah/Alpaca-tuned-gpt2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca-tuned-gpt2](https://huggingface.co/LordNoah/Alpaca-tuned-gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2",
"harness_winogrande_5",
split="train")
```
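Similarly, the aggregated run-level scores mentioned above live in the "results" configuration. A minimal sketch for pulling the latest aggregate numbers might look like this; the config and split names follow the file listing in this card's metadata, but the snippet itself is untested.
```python
from datasets import load_dataset
# "results" holds one row per run with the aggregated metrics;
# the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2",
"results",
split="latest")
print(results[0])
```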
## Latest results
These are the [latest results from run 2024-01-22T05:37:05.003524](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2/blob/main/results_2024-01-22T05-37-05.003524.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results live in its own configuration, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.27365684756436603,
"acc_stderr": 0.03148180517103422,
"acc_norm": 0.27504156315130057,
"acc_norm_stderr": 0.0322748672143349,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396736,
"mc2": 0.3764778441319414,
"mc2_stderr": 0.014234316118661302
},
"harness|arc:challenge|25": {
"acc": 0.25597269624573377,
"acc_stderr": 0.012753013241244518,
"acc_norm": 0.26535836177474403,
"acc_norm_stderr": 0.012902554762313964
},
"harness|hellaswag|10": {
"acc": 0.36367257518422624,
"acc_stderr": 0.004800728138792374,
"acc_norm": 0.4479187412865963,
"acc_norm_stderr": 0.004962638446395992
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644823,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.029379170464124818,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.029379170464124818
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102147,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102147
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764822,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.31088082901554404,
"acc_stderr": 0.033403619062765885,
"acc_norm": 0.31088082901554404,
"acc_norm_stderr": 0.033403619062765885
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3321100917431193,
"acc_stderr": 0.020192682985423344,
"acc_norm": 0.3321100917431193,
"acc_norm_stderr": 0.020192682985423344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012393,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012393
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879994,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879994
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927221,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.026799562024887678,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.026799562024887678
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.017077373377857002,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.017077373377857002
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.030021056238440296,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.030021056238440296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.033293941190735296,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.033293941190735296
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396736,
"mc2": 0.3764778441319414,
"mc2_stderr": 0.014234316118661302
},
"harness|winogrande|5": {
"acc": 0.5509076558800315,
"acc_stderr": 0.013979459389140839
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605213
}
}
```
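If you only need headline numbers from a results dump like the one above, a small post-processing sketch along these lines could aggregate the per-task accuracies. The key names mirror the JSON shown here; the only extra assumption is that the downloaded file may nest this mapping under a "results" key, which the snippet handles either way.
```python
import json
# Assumes a local copy of the results file linked above.
with open("results_2024-01-22T05-37-05.003524.json") as f:
data = json.load(f)
# The mapping shown above may be the whole file or nested under "results".
results = data.get("results", data)
mmlu_scores = [
m["acc"] for task, m in results.items()
if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```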
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2 | [
"region:us"
] | 2024-01-22T05:38:24+00:00 | {"pretty_name": "Evaluation run of LordNoah/Alpaca-tuned-gpt2", "dataset_summary": "Dataset automatically created during the evaluation run of model [LordNoah/Alpaca-tuned-gpt2](https://huggingface.co/LordNoah/Alpaca-tuned-gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T05:37:05.003524](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2/blob/main/results_2024-01-22T05-37-05.003524.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27365684756436603,\n \"acc_stderr\": 0.03148180517103422,\n \"acc_norm\": 0.27504156315130057,\n \"acc_norm_stderr\": 0.0322748672143349,\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396736,\n \"mc2\": 0.3764778441319414,\n \"mc2_stderr\": 0.014234316118661302\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.25597269624573377,\n \"acc_stderr\": 0.012753013241244518,\n \"acc_norm\": 0.26535836177474403,\n \"acc_norm_stderr\": 0.012902554762313964\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36367257518422624,\n \"acc_stderr\": 0.004800728138792374,\n \"acc_norm\": 0.4479187412865963,\n \"acc_norm_stderr\": 0.004962638446395992\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644823,\n \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644823\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.029379170464124818,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.029379170464124818\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102147,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102147\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764822,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.033403619062765885,\n \"acc_norm\": 0.31088082901554404,\n \"acc_norm_stderr\": 0.033403619062765885\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3321100917431193,\n \"acc_stderr\": 0.020192682985423344,\n \"acc_norm\": 0.3321100917431193,\n \"acc_norm_stderr\": 0.020192682985423344\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012393,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012393\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.020799400082879994,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.020799400082879994\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.33884297520661155,\n \"acc_stderr\": 0.043207678075366705,\n \"acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.043207678075366705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.20434227330779056,\n \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n \"acc_stderr\": 0.011005971399927221,\n \"acc_norm\": 0.24641460234680573,\n \"acc_norm_stderr\": 0.011005971399927221\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.026799562024887678,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.026799562024887678\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.017077373377857002,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.017077373377857002\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.030021056238440296,\n \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.030021056238440296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.033293941190735296,\n \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.033293941190735296\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396736,\n \"mc2\": 0.3764778441319414,\n \"mc2_stderr\": 0.014234316118661302\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5509076558800315,\n \"acc_stderr\": 0.013979459389140839\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n 
\"acc_stderr\": 0.0025049422268605213\n }\n}\n```", "repo_url": "https://huggingface.co/LordNoah/Alpaca-tuned-gpt2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|arc:challenge|25_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|gsm8k|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hellaswag|10_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T05-37-05.003524.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T05-37-05.003524.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T05-37-05.003524.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T05-37-05.003524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T05-37-05.003524.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T05_37_05.003524", "path": ["**/details_harness|winogrande|5_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T05-37-05.003524.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T05_37_05.003524", "path": ["results_2024-01-22T05-37-05.003524.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T05-37-05.003524.parquet"]}]}]} | 2024-01-22T05:38:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LordNoah/Alpaca-tuned-gpt2
Dataset automatically created during the evaluation run of model LordNoah/Alpaca-tuned-gpt2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
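A minimal sketch of that call (the repository id below is inferred from the Open LLM Leaderboard naming convention and should be verified; `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard convention
# "open-llm-leaderboard/details_<org>__<model>"; verify before use.
data = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca-tuned-gpt2",
    "harness_winogrande_5",
    split="train",
)
print(data)
```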
## Latest results
These are the latest results from run 2024-01-22T05:37:05.003524 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LordNoah/Alpaca-tuned-gpt2\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca-tuned-gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T05:37:05.003524(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LordNoah/Alpaca-tuned-gpt2\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca-tuned-gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T05:37:05.003524(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b974c2242204240c63ed7becc407fb3d9388028d |
This dataset was generated by reformatting [`coref-data/mmc_raw`](https://huggingface.co/datasets/coref-data/mmc_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
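A quick-start sketch (the configuration names come from this repository's metadata, e.g. `mmc_en`, `mmc_fa`, `mmc_fa_corrected`, `mmc_zh_corrected`, `mmc_zh_uncorrected`):

```python
from datasets import load_dataset

# Load the English configuration; swap the config name for the Farsi or
# Chinese variants listed above.
mmc_en = load_dataset("coref-data/mmc_indiscrim", "mmc_en")

example = mmc_en["train"][0]
# "coref_chains" is a nested list of integer spans; see the source repos
# for the exact indexing convention.
print(example["id"], len(example["coref_chains"]))
```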
| coref-data/mmc_indiscrim | [
"region:us"
] | 2024-01-22T05:59:43+00:00 | {"dataset_info": [{"config_name": "mmc_en", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "end_char", "dtype": "int64"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "start_char", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 32714450, "num_examples": 955}, {"name": "validation", "num_bytes": 4684074, "num_examples": 134}, {"name": "test", "num_bytes": 3576454, "num_examples": 133}], "download_size": 8195117, "dataset_size": 40974978}, {"config_name": "mmc_fa", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 8511917, "num_examples": 950}, {"name": "validation", "num_bytes": 1308706, "num_examples": 134}, {"name": "test", "num_bytes": 959400, "num_examples": 133}], "download_size": 3083246, "dataset_size": 10780023}, {"config_name": "mmc_fa_corrected", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 8511917, "num_examples": 950}, {"name": "validation", "num_bytes": 1308706, "num_examples": 134}, {"name": "test", "num_bytes": 988920, "num_examples": 133}], "download_size": 3086246, "dataset_size": 10809543}, {"config_name": "mmc_zh_corrected", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 8024979, "num_examples": 948}, {"name": "validation", "num_bytes": 1217704, 
"num_examples": 134}, {"name": "test", "num_bytes": 765302, "num_examples": 133}], "download_size": 2653472, "dataset_size": 10007985}, {"config_name": "mmc_zh_uncorrected", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 8024979, "num_examples": 948}, {"name": "validation", "num_bytes": 1217704, "num_examples": 134}, {"name": "test", "num_bytes": 926344, "num_examples": 133}], "download_size": 2655536, "dataset_size": 10169027}], "configs": [{"config_name": "mmc_en", "data_files": [{"split": "train", "path": "mmc_en/train-*"}, {"split": "validation", "path": "mmc_en/validation-*"}, {"split": "test", "path": "mmc_en/test-*"}]}, {"config_name": "mmc_fa", "data_files": [{"split": "train", "path": "mmc_fa/train-*"}, {"split": "validation", "path": "mmc_fa/validation-*"}, {"split": "test", "path": "mmc_fa/test-*"}]}, {"config_name": "mmc_fa_corrected", "data_files": [{"split": "train", "path": "mmc_fa_corrected/train-*"}, {"split": "validation", "path": "mmc_fa_corrected/validation-*"}, {"split": "test", "path": "mmc_fa_corrected/test-*"}]}, {"config_name": "mmc_zh_corrected", "data_files": [{"split": "train", "path": "mmc_zh_corrected/train-*"}, {"split": "validation", "path": "mmc_zh_corrected/validation-*"}, {"split": "test", "path": "mmc_zh_corrected/test-*"}]}, {"config_name": "mmc_zh_uncorrected", "data_files": [{"split": "train", "path": "mmc_zh_uncorrected/train-*"}, {"split": "validation", "path": "mmc_zh_uncorrected/validation-*"}, {"split": "test", "path": "mmc_zh_uncorrected/test-*"}]}]} | 2024-02-13T04:04:52+00:00 | [] | [] | TAGS
#region-us
|
This dataset was generated by reformatting 'coref-data/mmc_raw' into the indiscrim coreference format. See that repo for dataset details.
See ianporada/coref-data for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
| [] | [
"TAGS\n#region-us \n"
] |
59e92a9f79d72813c9b6111c5b492ec9c12adfa3 | # MMCBench Dataset: Benchmarking Dataset for Multimodal Model Evaluation 🚀
## Overview
The MMCBench Dataset is a curated collection of data designed for the comprehensive evaluation of Large Multimodal Models (LMMs) under common corruption scenarios. This dataset supports the MMCBench framework, focusing on cross-modal interactions involving text, image, and speech. It provides essential data for generative tasks such as text-to-image, image-to-text, text-to-speech, and speech-to-text, enabling robustness and self-consistency assessments of LMMs.
## Dataset Composition 📊
The MMCBench Dataset is structured to facilitate the evaluation across four key generative tasks:
- **Text-to-Image:** A collection of text descriptions with their corresponding corrupted versions and associated images.
- **Image-to-Text:** A set of images with clean and corrupted captions.
- **Text-to-Speech:** Text inputs with their clean and corrupted audio outputs.
- **Speech-to-Text:** Audio files with transcriptions before and after audio corruptions.
Each subset of the dataset has been meticulously selected and processed to represent challenging scenarios for LMMs.
## Using the Dataset 🛠️
To use the MMCBench Dataset for model evaluation:
1. **Access the Data**: The dataset is hosted on Hugging Face and can be accessed using the `datasets` library or direct download (see the sketch after this list).
2. **Select the Task**: Choose from text-to-image, image-to-text, text-to-speech, or speech-to-text tasks based on your model's capabilities.
3. **Apply the Benchmark**: Utilize the data for each task to test your model's performance against various corruptions. Follow the [MMCBench](https://github.com/sail-sg/MMCBench/tree/main) framework for a consistent and standardized evaluation.
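A minimal sketch of step 1 (the repository id `javyduck/MMCBench` comes from this card; whether the tasks are exposed as dataset configurations or only as raw files under the task directories is an assumption to verify against the repository):

```python
from huggingface_hub import snapshot_download

# Download the raw dataset files, then pick the task directory you need
# (text2image/, image2text/, text2speech/, speech2text/).
local_dir = snapshot_download(repo_id="javyduck/MMCBench", repo_type="dataset")
print("Downloaded to:", local_dir)

# If the repo exposes standard configurations, the datasets library can be
# used instead, e.g.:
# from datasets import load_dataset
# speech2text = load_dataset("javyduck/MMCBench", "speech2text")
```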
### Dataset Structure 📁
The dataset is organized into four main directories, each corresponding to one of the generative tasks:
- `text2image/`: Contains text inputs and associated images.
- `image2text/`: Comprises images and their descriptive captions.
- `text2speech/`: Includes text inputs and generated speech outputs.
- `speech2text/`: Contains audio files and their transcriptions.
## Contributing to the Dataset 🤝
Contributions to the MMCBench Dataset are welcome. If you have suggestions for additional data or improvements, please reach out through the Hugging Face platform or directly contribute via GitHub.
## License 📜
The MMCBench Dataset is made available under the Apache 2.0 License, ensuring open and ethical use for research and development.
## Acknowledgments and Citations 📚
When using the MMCBench Dataset in your research, please cite it appropriately. We extend our gratitude to all contributors and collaborators who have enriched this dataset, making it a valuable resource for the AI and ML community. | javyduck/MMCBench | [
"region:us"
] | 2024-01-22T06:02:41+00:00 | {} | 2024-01-23T05:55:21+00:00 | [] | [] | TAGS
#region-us
| # MMCBench Dataset: Benchmarking Dataset for Multimodal Model Evaluation
## Overview
The MMCBench Dataset is a curated collection of data designed for the comprehensive evaluation of Large Multimodal Models (LMMs) under common corruption scenarios. This dataset supports the MMCBench framework, focusing on cross-modal interactions involving text, image, and speech. It provides essential data for generative tasks such as text-to-image, image-to-text, text-to-speech, and speech-to-text, enabling robustness and self-consistency assessments of LMMs.
## Dataset Composition
The MMCBench Dataset is structured to facilitate the evaluation across four key generative tasks:
- Text-to-Image: A collection of text descriptions with their corresponding corrupted versions and associated images.
- Image-to-Text: A set of images with clean and corrupted captions.
- Text-to-Speech: Text inputs with their clean and corrupted audio outputs.
- Speech-to-Text: Audio files with transcriptions before and after audio corruptions.
Each subset of the dataset has been meticulously selected and processed to represent challenging scenarios for LMMs.
## Using the Dataset ️
To use the MMCBench Dataset for model evaluation:
1. Access the Data: The dataset is hosted on Hugging Face and can be accessed using their dataset library or direct download.
2. Select the Task: Choose from text-to-image, image-to-text, text-to-speech, or speech-to-text tasks based on your model's capabilities.
3. Apply the Benchmark: Utilize the data for each task to test your model's performance against various corruptions. Follow the MMCBench framework for a consistent and standardized evaluation.
### Dataset Structure
The dataset is organized into four main directories, each corresponding to one of the generative tasks:
- 'text2image/': Contains text inputs and associated images.
- 'image2text/': Comprises images and their descriptive captions.
- 'text2speech/': Includes text inputs and generated speech outputs.
- 'speech2text/': Contains audio files and their transcriptions.
## Contributing to the Dataset
Contributions to the MMCBench Dataset are welcome. If you have suggestions for additional data or improvements, please reach out through the Hugging Face platform or directly contribute via GitHub.
## License
The MMCBench Dataset is made available under the Apache 2.0 License, ensuring open and ethical use for research and development.
## Acknowledgments and Citations
When using the MMCBench Dataset in your research, please cite it appropriately. We extend our gratitude to all contributors and collaborators who have enriched this dataset, making it a valuable resource for the AI and ML community. | [
"# MMCBench Dataset: Benchmarking Dataset for Multimodal Model Evaluation",
"## Overview\n\nThe MMCBench Dataset is a curated collection of data designed for the comprehensive evaluation of Large Multimodal Models (LMMs) under common corruption scenarios. This dataset supports the MMCBench framework, focusing on cross-modal interactions involving text, image, and speech. It provides essential data for generative tasks such as text-to-image, image-to-text, text-to-speech, and speech-to-text, enabling robustness and self-consistency assessments of LMMs.",
"## Dataset Composition \n\nThe MMCBench Dataset is structured to facilitate the evaluation across four key generative tasks:\n\n- Text-to-Image: A collection of text descriptions with their corresponding corrupted versions and associated images.\n- Image-to-Text: A set of images with clean and corrupted captions.\n- Text-to-Speech: Text inputs with their clean and corrupted audio outputs.\n- Speech-to-Text: Audio files with transcriptions before and after audio corruptions.\n\nEach subset of the dataset has been meticulously selected and processed to represent challenging scenarios for LMMs.",
"## Using the Dataset ️\n\nTo use the MMCBench Dataset for model evaluation:\n\n1. Access the Data: The dataset is hosted on Hugging Face and can be accessed using their dataset library or direct download.\n2. Select the Task: Choose from text-to-image, image-to-text, text-to-speech, or speech-to-text tasks based on your model's capabilities.\n3. Apply the Benchmark: Utilize the data for each task to test your model's performance against various corruptions. Follow the MMCBench framework for a consistent and standardized evaluation.",
"### Dataset Structure \n\nThe dataset is organized into four main directories, each corresponding to one of the generative tasks:\n\n- 'text2image/': Contains text inputs and associated images.\n- 'image2text/': Comprises images and their descriptive captions.\n- 'text2speech/': Includes text inputs and generated speech outputs.\n- 'speech2text/': Contains audio files and their transcriptions.",
"## Contributing to the Dataset \n\nContributions to the MMCBench Dataset are welcome. If you have suggestions for additional data or improvements, please reach out through the Hugging Face platform or directly contribute via GitHub.",
"## License \n\nThe MMCBench Dataset is made available under the Apache 2.0 License, ensuring open and ethical use for research and development.",
"## Acknowledgments and Citations \n\nWhen using the MMCBench Dataset in your research, please cite it appropriately. We extend our gratitude to all contributors and collaborators who have enriched this dataset, making it a valuable resource for the AI and ML community."
] | [
"TAGS\n#region-us \n",
"# MMCBench Dataset: Benchmarking Dataset for Multimodal Model Evaluation",
"## Overview\n\nThe MMCBench Dataset is a curated collection of data designed for the comprehensive evaluation of Large Multimodal Models (LMMs) under common corruption scenarios. This dataset supports the MMCBench framework, focusing on cross-modal interactions involving text, image, and speech. It provides essential data for generative tasks such as text-to-image, image-to-text, text-to-speech, and speech-to-text, enabling robustness and self-consistency assessments of LMMs.",
"## Dataset Composition \n\nThe MMCBench Dataset is structured to facilitate the evaluation across four key generative tasks:\n\n- Text-to-Image: A collection of text descriptions with their corresponding corrupted versions and associated images.\n- Image-to-Text: A set of images with clean and corrupted captions.\n- Text-to-Speech: Text inputs with their clean and corrupted audio outputs.\n- Speech-to-Text: Audio files with transcriptions before and after audio corruptions.\n\nEach subset of the dataset has been meticulously selected and processed to represent challenging scenarios for LMMs.",
"## Using the Dataset ️\n\nTo use the MMCBench Dataset for model evaluation:\n\n1. Access the Data: The dataset is hosted on Hugging Face and can be accessed using their dataset library or direct download.\n2. Select the Task: Choose from text-to-image, image-to-text, text-to-speech, or speech-to-text tasks based on your model's capabilities.\n3. Apply the Benchmark: Utilize the data for each task to test your model's performance against various corruptions. Follow the MMCBench framework for a consistent and standardized evaluation.",
"### Dataset Structure \n\nThe dataset is organized into four main directories, each corresponding to one of the generative tasks:\n\n- 'text2image/': Contains text inputs and associated images.\n- 'image2text/': Comprises images and their descriptive captions.\n- 'text2speech/': Includes text inputs and generated speech outputs.\n- 'speech2text/': Contains audio files and their transcriptions.",
"## Contributing to the Dataset \n\nContributions to the MMCBench Dataset are welcome. If you have suggestions for additional data or improvements, please reach out through the Hugging Face platform or directly contribute via GitHub.",
"## License \n\nThe MMCBench Dataset is made available under the Apache 2.0 License, ensuring open and ethical use for research and development.",
"## Acknowledgments and Citations \n\nWhen using the MMCBench Dataset in your research, please cite it appropriately. We extend our gratitude to all contributors and collaborators who have enriched this dataset, making it a valuable resource for the AI and ML community."
] |
891d2f2cfe5410467ae69344507b5ecc79a77b21 |
Source: https://universe.roboflow.com/david-bxemt/detecciones
Follow the source license
| brainer/detecciones | [
"region:us"
] | 2024-01-22T06:23:19+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "width", "dtype": "int64"}, {"name": "height", "dtype": "int64"}, {"name": "image_id", "dtype": "int64"}, {"name": "objects", "sequence": [{"name": "area", "dtype": "float64"}, {"name": "bbox", "sequence": "float64"}, {"name": "category", "dtype": "int64"}, {"name": "id", "dtype": "int64"}]}], "splits": [{"name": "train", "num_bytes": 127628464.08, "num_examples": 3330}, {"name": "test", "num_bytes": 7837783.0, "num_examples": 175}, {"name": "valid", "num_bytes": 11554347.0, "num_examples": 301}], "download_size": 125191442, "dataset_size": 147020594.07999998}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2024-01-22T08:07:40+00:00 | [] | [] | TAGS
#region-us
|
Source: URL
Follow the source license
| [] | [
"TAGS\n#region-us \n"
] |
177841080f84dcf1a5bea97ee36d4abc1f31e2a9 |
Check out the [paper](https://arxiv.org/abs/2401.13311). | ucla-contextual/contextual_all | [
"license:mit",
"arxiv:2401.13311",
"region:us"
] | 2024-01-22T06:53:25+00:00 | {"license": "mit"} | 2024-02-05T06:39:26+00:00 | [
"2401.13311"
] | [] | TAGS
#license-mit #arxiv-2401.13311 #region-us
|
Check out the paper. | [] | [
"TAGS\n#license-mit #arxiv-2401.13311 #region-us \n"
] |
716afd2ddb2264daa994a8afc54fb5e99f7b3141 | # Dataset Card for "poc_last"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wisenut-nlp-team/poc_last | [
"region:us"
] | 2024-01-22T07:11:05+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "sequence": "string"}, {"name": "answer", "sequence": "string"}, {"name": "original_answer", "sequence": "string"}, {"name": "similar_contexts", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 49532937707.387, "num_examples": 1908041}, {"name": "validation", "num_bytes": 5254087627.188539, "num_examples": 201427}], "download_size": 27113302745, "dataset_size": 54787025334.57554}} | 2024-01-22T09:04:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "poc_last"
More Information needed | [
"# Dataset Card for \"poc_last\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"poc_last\"\n\nMore Information needed"
] |
c4ddcf4b2500a2ccde373c0b05665cb69f24cfbc |
Context
Rice genotype and phenotype data.
Twelve agronomic traits and a hundred simulated traits.
May be useful for GWAS (Genome Wide Association Study) models.
The column name IDs in the trait file map to the following traits.
CUDI_REPRO -> Culm diameter
CULT_REPRO -> Culm length
CUNO_REPRO -> Culm number
GRLT -> Grain length
GRWD -> Grain width
GRWT100 -> Grain weight
HDG_80HEAD -> Heading date
LIGLT -> Ligule length
LLT -> Leaf length
LWD -> Leaf width
PLT_POST -> Panicle length
SDHT -> Seedling height
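The mapping above can be expressed directly in code; a minimal sketch (the trait file name is a placeholder — substitute the actual file shipped with the dataset):

```python
import pandas as pd

# Column IDs in the trait file mapped to human-readable trait names.
TRAIT_NAMES = {
    "CUDI_REPRO": "Culm diameter",
    "CULT_REPRO": "Culm length",
    "CUNO_REPRO": "Culm number",
    "GRLT": "Grain length",
    "GRWD": "Grain width",
    "GRWT100": "Grain weight",
    "HDG_80HEAD": "Heading date",
    "LIGLT": "Ligule length",
    "LLT": "Leaf length",
    "LWD": "Leaf width",
    "PLT_POST": "Panicle length",
    "SDHT": "Seedling height",
}

# "traits.csv" is a hypothetical file name.
traits = pd.read_csv("traits.csv").rename(columns=TRAIT_NAMES)
print(traits.columns.tolist())
```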
Acknowledgements
Orhobor, Oghenejokpeme; Alexandrov, Nickolai; Chebotarev, Dmitri; Kretzschmar, Tobias; McNally, Kenneth L.; Sanciangco, Millicent; King, Ross (2018), “Rice genotype and phenotype data.”, Mendeley Data, V1, doi: 10.17632/sr8zzsrpcs.1
Originally posted on Kaggle by SAURABH SHAHANE, licensed under CC BY 4.0 (Attribution 4.0 International). | Solshine/Rice_Genotype_and_Phenotype_Data | [
"license:cc-by-sa-4.0",
"region:us"
] | 2024-01-22T07:16:21+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-22T07:25:32+00:00 | [] | [] | TAGS
#license-cc-by-sa-4.0 #region-us
|
Context
Rice genotype and phenotype data.
Twelve agronomic traits and a hundred simulated traits.
May be useful for GWAS (Genome Wide Association Study) models.
The column name IDs in the trait file map to the following traits.
CUDI_REPRO -> Culm diameter
CULT_REPRO -> Culm length
CUNO_REPRO -> Culm number
GRLT -> Grain length
GRWD -> Grain width
GRWT100 -> Grain weight
HDG_80HEAD -> Heading date
LIGLT -> Ligule length
LLT -> Leaf length
LWD -> Leaf width
PLT_POST -> Panicle length
SDHT -> Seedling height
Acknowledgements
Orhobor, Oghenejokpeme; Alexandrov, Nickolai; Chebotarev, Dmitri; Kretzschmar, Tobias; McNally, Kenneth L.; Sanciangco, Millicent; King, Ross (2018), “Rice genotype and phenotype data.”, Mendeley Data, V1, doi: 10.17632/sr8zzsrpcs.1
Originally posted on Kaggle by SAURABH SHAHANE, licensed under CC BY 4.0 (Attribution 4.0 International). | [] | [
"TAGS\n#license-cc-by-sa-4.0 #region-us \n"
] |
e93227baec94ab4116a90d432978ee7696d50239 |
Context:
This data originally came from the College of Agriculture and Forestry
Originally posted to Kaggle by AGRICULTURAL INNOVATIONS with the following description
"Precision agriculture is in trend nowadays. It helps the farmers to get informed decision about the farming strategy. Here, we present to you a dataset which would allow the users to build a predictive model to recommend the most suitable crops to grow in a particular farm based on various parameters."
Includes recommendations for the following needs of plants:
N,
P,
K,
temperature,
humidity,
ph,
rainfall
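A minimal modelling sketch for the crop-recommendation use case described above (the CSV file name and the label column are assumptions to check against the actual files; the feature columns follow the parameters listed above):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical file and label-column names -- adjust to the dataset's files.
df = pd.read_csv("crop_recommendation.csv")
features = ["N", "P", "K", "temperature", "humidity", "ph", "rainfall"]
X, y = df[features], df["label"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```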
This may also be useful for training models that recommend nutrition for crops based on environmental conditions. | Solshine/CollegeOfAgricultureAndForestry_Agricultural_Crop_Dataset | [
"license:cc",
"region:us"
] | 2024-01-22T07:28:34+00:00 | {"license": "cc"} | 2024-01-23T04:42:21+00:00 | [] | [] | TAGS
#license-cc #region-us
|
Context:
This data originally came from the College of Agriculture and Forestry
Originally posted to Kaggle by AGRICULTURAL INNOVATIONS with the following description
"Precision agriculture is in trend nowadays. It helps the farmers to get informed decision about the farming strategy. Here, we present to you a dataset which would allow the users to build a predictive model to recommend the most suitable crops to grow in a particular farm based on various parameters."
Includes recommendations for the following needs of plants:
N,
P,
K,
temperature,
humidity,
ph,
rainfall
This may also be useful for training models that recommend nutrition for crops based on environmental conditions. | [] | [
"TAGS\n#license-cc #region-us \n"
] |
7d3443b863cd2758398e44025368027d030fb65d |
# Dataset Card for Evaluation run of FelixChao/Sirius-10B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Sirius-10B](https://huggingface.co/FelixChao/Sirius-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Sirius-10B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T07:26:29.480473](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Sirius-10B/blob/main/results_2024-01-22T07-26-29.480473.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6512873434629626,
"acc_stderr": 0.032166651969407094,
"acc_norm": 0.6523094390168064,
"acc_norm_stderr": 0.03282308883774373,
"mc1": 0.5226438188494492,
"mc1_stderr": 0.017485542258489643,
"mc2": 0.6810112131261441,
"mc2_stderr": 0.015016502423502063
},
"harness|arc:challenge|25": {
"acc": 0.6928327645051194,
"acc_stderr": 0.013481034054980943,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.013131238126975576
},
"harness|hellaswag|10": {
"acc": 0.6930890260904202,
"acc_stderr": 0.004602695416756988,
"acc_norm": 0.8732324238199561,
"acc_norm_stderr": 0.0033203245481454044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269073,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598564,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598564
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5226438188494492,
"mc1_stderr": 0.017485542258489643,
"mc2": 0.6810112131261441,
"mc2_stderr": 0.015016502423502063
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247003
},
"harness|gsm8k|5": {
"acc": 0.6209249431387415,
"acc_stderr": 0.01336363029508835
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Sirius-10B | [
"region:us"
] | 2024-01-22T07:28:44+00:00 | {"pretty_name": "Evaluation run of FelixChao/Sirius-10B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Sirius-10B](https://huggingface.co/FelixChao/Sirius-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Sirius-10B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T07:26:29.480473](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Sirius-10B/blob/main/results_2024-01-22T07-26-29.480473.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512873434629626,\n \"acc_stderr\": 0.032166651969407094,\n \"acc_norm\": 0.6523094390168064,\n \"acc_norm_stderr\": 0.03282308883774373,\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.017485542258489643,\n \"mc2\": 0.6810112131261441,\n \"mc2_stderr\": 0.015016502423502063\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980943,\n \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.013131238126975576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6930890260904202,\n \"acc_stderr\": 0.004602695416756988,\n \"acc_norm\": 0.8732324238199561,\n \"acc_norm_stderr\": 0.0033203245481454044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n \"acc_stderr\": 0.016650437588269073,\n \"acc_norm\": 0.45363128491620114,\n \"acc_norm_stderr\": 0.016650437588269073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598564,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598564\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.017485542258489643,\n \"mc2\": 0.6810112131261441,\n \"mc2_stderr\": 0.015016502423502063\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247003\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \"acc_stderr\": 0.01336363029508835\n }\n}\n```", "repo_url": 
"https://huggingface.co/FelixChao/Sirius-10B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|arc:challenge|25_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|gsm8k|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hellaswag|10_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-26-29.480473.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-26-29.480473.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-26-29.480473.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T07-26-29.480473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-26-29.480473.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T07_26_29.480473", "path": ["**/details_harness|winogrande|5_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T07-26-29.480473.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T07_26_29.480473", "path": ["results_2024-01-22T07-26-29.480473.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T07-26-29.480473.parquet"]}]}]} | 2024-01-22T07:29:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Sirius-10B
Dataset automatically created during the evaluation run of model FelixChao/Sirius-10B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
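For example, using the `datasets` library (`harness_winogrande_5` is just one of the 63 available configurations; any other configuration name can be substituted):

```python
from datasets import load_dataset

# Load the per-sample details of a single task from this evaluation run
data = load_dataset("open-llm-leaderboard/details_FelixChao__Sirius-10B",
                    "harness_winogrande_5",
                    split="train")
```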
## Latest results
These are the latest results from run 2024-01-22T07:26:29.480473 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
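An excerpt of the aggregated scores from that run (the full per-task breakdown is stored in the repository's result files):

```python
{
    "all": {
        "acc": 0.6512873434629626,
        "acc_stderr": 0.032166651969407094,
        "acc_norm": 0.6523094390168064,
        "acc_norm_stderr": 0.03282308883774373,
        "mc1": 0.5226438188494492,
        "mc1_stderr": 0.017485542258489643,
        "mc2": 0.6810112131261441,
        "mc2_stderr": 0.015016502423502063
    }
}
```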
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Sirius-10B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Sirius-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T07:26:29.480473(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Sirius-10B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Sirius-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T07:26:29.480473(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
37608025af45057383b4dc6eef06c3a43799150a |
# Dataset Card for Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eurdem/Voltran-1.0-MoE-2x7B](https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B",
"harness_winogrande_5",
split="train")
```
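
The aggregated metrics mentioned above are stored in the "results" configuration; a minimal sketch for loading them (assuming the same repository id, with the "latest" split pointing to the most recent run):

```python
from datasets import load_dataset

# Aggregated results of the run; "latest" always points to the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B",
                       "results",
                       split="latest")
```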
## Latest results
These are the [latest results from run 2024-01-22T07:49:54.062079](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B/blob/main/results_2024-01-22T07-49-54.062079.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144636049773026,
"acc_stderr": 0.033017267421085336,
"acc_norm": 0.6168693020696908,
"acc_norm_stderr": 0.033678924264574035,
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5748009213372511,
"mc2_stderr": 0.015610411040968409
},
"harness|arc:challenge|25": {
"acc": 0.5955631399317406,
"acc_stderr": 0.014342036483436179,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839162
},
"harness|hellaswag|10": {
"acc": 0.6444931288587931,
"acc_stderr": 0.004776883632722614,
"acc_norm": 0.837382991435969,
"acc_norm_stderr": 0.0036826171219143085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644823,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.02822949732031721,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.02822949732031721
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940788,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940788
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739152,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242836,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5748009213372511,
"mc2_stderr": 0.015610411040968409
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
},
"harness|gsm8k|5": {
"acc": 0.5595147839272175,
"acc_stderr": 0.013674572131693888
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B | [
"region:us"
] | 2024-01-22T07:52:07+00:00 | {"pretty_name": "Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Eurdem/Voltran-1.0-MoE-2x7B](https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T07:49:54.062079](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B/blob/main/results_2024-01-22T07-49-54.062079.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144636049773026,\n \"acc_stderr\": 0.033017267421085336,\n \"acc_norm\": 0.6168693020696908,\n \"acc_norm_stderr\": 0.033678924264574035,\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5748009213372511,\n \"mc2_stderr\": 0.015610411040968409\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.014342036483436179,\n \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6444931288587931,\n \"acc_stderr\": 0.004776883632722614,\n \"acc_norm\": 0.837382991435969,\n \"acc_norm_stderr\": 0.0036826171219143085\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644823,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644823\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n \"acc_stderr\": 0.02822949732031721,\n \"acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.02822949732031721\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n 
\"acc_stderr\": 0.024939313906940788,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940788\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 
0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242836,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5748009213372511,\n \"mc2_stderr\": 0.015610411040968409\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5595147839272175,\n \"acc_stderr\": 0.013674572131693888\n }\n}\n```", "repo_url": 
"https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|arc:challenge|25_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|gsm8k|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hellaswag|10_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T07_49_54.062079", "path": ["**/details_harness|winogrande|5_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T07-49-54.062079.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T07_49_54.062079", "path": ["results_2024-01-22T07-49-54.062079.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T07-49-54.062079.parquet"]}]}]} | 2024-01-22T07:52:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B
Dataset automatically created during the evaluation run of model Eurdem/Voltran-1.0-MoE-2x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
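For example, using the `datasets` library (this mirrors the loading snippet recorded in the dataset metadata; `harness_winogrande_5` is one of the per-task configurations listed for this dataset and can be swapped for any other configuration, including `results`):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (here: Winogrande, 5-shot).
data = load_dataset(
    "open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B",
    "harness_winogrande_5",
    split="train",
)
```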
## Latest results
These are the latest results from run 2024-01-22T07:49:54.062079 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/Voltran-1.0-MoE-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T07:49:54.062079(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/Voltran-1.0-MoE-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T07:49:54.062079(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3c2c9fe9e5785e6d81f5510bf4639e7e5c9d2cb0 | # Dataset Card for "one-summary-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mtc/one-summary-test | [
"region:us"
] | 2024-01-22T08:01:11+00:00 | {"dataset_info": {"features": [{"name": "document", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5018, "num_examples": 4}], "download_size": 0, "dataset_size": 5018}} | 2024-01-24T12:10:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "one-summary-test"
More Information needed | [
"# Dataset Card for \"one-summary-test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"one-summary-test\"\n\nMore Information needed"
] |
4c46ec3a9ce0d63521f23304653d6572b9d4e681 |
About Dataset
Context
The National Park Service publishes a database of animal and plant species identified in individual national parks and verified by evidence — observations, vouchers, or reports that document the presence of a species in a park. All park species records are available to the public on the National Park Species portal; exceptions are made for sensitive, threatened, or endangered species when widespread distribution of information could pose a risk to the species in the park.
Content
National Park species lists provide information on the presence and status of species in our national parks. These species lists are works in progress and the absence of a species from a list does not necessarily mean the species is absent from a park. The time and effort spent on species inventories varies from park to park, which may result in data gaps. Species taxonomy changes over time and reflects regional variations or preferences; therefore, records may be listed under a different species name.
Each park species record includes a species ID, park name, taxonomic information, scientific name, one or more common names, record status, occurrence (verification of species presence in park), nativeness (species native or foreign to park), abundance (presence and visibility of species in park), seasonality (season and nature of presence in park), and conservation status (species classification according to US Fish & Wildlife Service). Taxonomic classes have been translated from Latin to English for species categorization; order, family, and scientific name (genus, species, subspecies) are in Latin.
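For illustration, a record structure like this can be explored with a few lines of pandas. This is only a sketch: the file name `species.csv` and the exact column labels (`Occurrence`, `Park Name`, `Category`) are assumptions, since the card does not pin down the export format.
```python
import pandas as pd

# Hypothetical file and column names -- adjust to the actual NPSpecies export.
species = pd.read_csv("species.csv")

# Keep only records whose presence in the park has been verified.
confirmed = species[species["Occurrence"] == "Present"]

# Count confirmed species per park and taxonomic category.
print(confirmed.groupby(["Park Name", "Category"]).size().head(10))
```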
Acknowledgements
The National Park Service species list database is managed and updated by staff at individual national parks and the systemwide Inventory and Monitoring department.
Source: https://irma.nps.gov/NPSpecies
Also available on Kaggle: https://www.kaggle.com/datasets/nationalparkservice/park-biodiversity
Users interested in getting this data via web services, please go to: http://irmaservices.nps.gov | Solshine/Biodiversity_In_National_Parks | [
"license:cc",
"region:us"
] | 2024-01-22T08:08:19+00:00 | {"license": "cc"} | 2024-01-22T08:11:25+00:00 | [] | [] | TAGS
#license-cc #region-us
|
About Dataset
Context
The National Park Service publishes a database of animal and plant species identified in individual national parks and verified by evidence — observations, vouchers, or reports that document the presence of a species in a park. All park species records are available to the public on the National Park Species portal; exceptions are made for sensitive, threatened, or endangered species when widespread distribution of information could pose a risk to the species in the park.
Content
National Park species lists provide information on the presence and status of species in our national parks. These species lists are works in progress and the absence of a species from a list does not necessarily mean the species is absent from a park. The time and effort spent on species inventories varies from park to park, which may result in data gaps. Species taxonomy changes over time and reflects regional variations or preferences; therefore, records may be listed under a different species name.
Each park species record includes a species ID, park name, taxonomic information, scientific name, one or more common names, record status, occurrence (verification of species presence in park), nativeness (species native or foreign to park), abundance (presence and visibility of species in park), seasonality (season and nature of presence in park), and conservation status (species classification according to US Fish & Wildlife Service). Taxonomic classes have been translated from Latin to English for species categorization; order, family, and scientific name (genus, species, subspecies) are in Latin.
Acknowledgements
The National Park Service species list database is managed and updated by staff at individual national parks and the systemwide Inventory and Monitoring department.
Source: URL
Also available on Kaggle: URL
Users interested in getting this data via web services, please go to: URL | [] | [
"TAGS\n#license-cc #region-us \n"
] |
efa78cc2f74bbcd21eff2261f9e13aebe40b814e | Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering
https://github.com/amazon-science/mintaka
We only took entity-type answers and avoided answers that were only numbers or booleans
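A sketch of what that filter might look like on the raw Mintaka JSON release; the field names (`answer`, `answerType`) and the type labels are assumptions, so check the schema in the repository linked above before relying on them.
```python
import json

# Keep only entity-type answers, dropping purely numeric or boolean ones.
# Field names and type labels below are assumptions about the Mintaka schema.
with open("mintaka_test.json", encoding="utf-8") as f:
    questions = json.load(f)

kept = [
    q for q in questions
    if q.get("answer", {}).get("answerType") not in ("numerical", "boolean")
]
print(f"kept {len(kept)} of {len(questions)} questions")
```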
```
@inproceedings{sen-etal-2022-mintaka,
title = "Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering",
author = "Sen, Priyanka and
Aji, Alham Fikri and
Saffari, Amir",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics",
url = "https://aclanthology.org/2022.coling-1.138",
pages = "1604--1619"
}
``` | jinaai/mintakaqa | [
"region:eu"
] | 2024-01-22T08:21:58+00:00 | {} | 2024-01-22T13:03:06+00:00 | [] | [] | TAGS
#region-eu
| Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering
URL
We only took entity-type answers and avoided answers that were only numbers or booleans
| [] | [
"TAGS\n#region-eu \n"
] |
0afdbedeefdfb34e328fea3c0ed68c7bdecf13a1 | # Dataset Card for "one-document-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mtc/one-document-test | [
"region:us"
] | 2024-01-22T08:26:02+00:00 | {"dataset_info": {"features": [{"name": "document", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18457, "num_examples": 4}], "download_size": 19435, "dataset_size": 18457}} | 2024-01-22T14:31:25+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "one-document-test"
More Information needed | [
"# Dataset Card for \"one-document-test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"one-document-test\"\n\nMore Information needed"
] |
b00204c55460c28b25eac841a3de8d0b01a488a7 |
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
| dhruv-anand-aintech/vdf_20240122_140004_c932b | [
"vdf",
"vector-io",
"vector-dataset",
"vector-embeddings",
"region:us"
] | 2024-01-22T08:30:23+00:00 | {"tags": ["vdf", "vector-io", "vector-dataset", "vector-embeddings"]} | 2024-01-22T08:30:31+00:00 | [] | [] | TAGS
#vdf #vector-io #vector-dataset #vector-embeddings #region-us
|
This is a dataset created using vector-io
| [] | [
"TAGS\n#vdf #vector-io #vector-dataset #vector-embeddings #region-us \n"
] |
161fd3ce40f6fad4c57ecee9c51d72d8fc4a3434 | This dataset is for VinT_Bench: Benchmarking the Object-in-hand Pose from Vision, Touch, and Proprioception.
Senlin updated the vint-sim, Zhaoliang updated the vint-real | Jeffreyzhaoliang/vint-bench | [
"license:mit",
"region:us"
] | 2024-01-22T08:31:47+00:00 | {"license": "mit"} | 2024-02-06T05:59:30+00:00 | [] | [] | TAGS
#license-mit #region-us
| This dataset is for VinT_Bench: Benchmarking the Object-in-hand Pose from Vision, Touch, and Proprioception.
Senlin updated the vint-sim, Zhaoliang updated the vint-real | [] | [
"TAGS\n#license-mit #region-us \n"
] |
680bd6217a6a1eb9f7b20418fe58506a3afe61d3 | # Animagine XL 3.0 Character
This is a dataset of full-body standing images for the [official Character wildcard](https://huggingface.co/spaces/Linaqruf/animagine-xl/resolve/main/wildcard/character.txt) of [Animagine XL 3.0](https://huggingface.co/cagliostrolab/animagine-xl-3.0), generated with [EasySdxlWebUi](https://github.com/Zuntan03/EasySdxlWebUi).
Download the dataset [here (2,880 images, 497 MB)](https://huggingface.co/datasets/Zuntan/Animagine_XL_3.0-Character/resolve/main/character.zip?download=true).
**[Facial expressions (278 MB)](https://huggingface.co/datasets/Zuntan/Animagine_XL_3.0-Character/resolve/main/face.zip?download=true) and [art styles (115 MB)](https://yyy.wpx.jp/EasySdxlWebUi/style.zip) are also available.**

This project started from the idea of building a wildcard list that works correctly, verified through image similarity and Tagger result comparisons.
However, because even incorrect images (different outfits and the like) are strongly influenced by the series and character names, classifying correct versus incorrect without additional sources looks difficult.
- Drag and drop each webp image onto `PNG内の情報を表示` (PNG Info) in [Stable Diffusion web UI](https://github.com/AUTOMATIC1111/stable-diffusion-webui) to view its generation parameters.
- The prompt is `__animagine/character__, solo, full body, standing, no background, simple background, masterpiece, best quality <lora:lcm-animagine-3:1>`.
- The negative prompt prepends an NSFW countermeasure to the default Animagine XL negative prompt: `nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name`.
- The generation size before upscaling is `832` x `1216`.
- The seed is `1234567`.
- Correct/incorrect results may change with a different seed.
- Everything else uses the EasySdxlWebUi default settings. (A minimal API sketch of these settings follows below.)
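A minimal sketch of the settings above expressed as a Stable Diffusion web UI API call. This is not part of the original card; it assumes the webui is running with `--api` enabled, that the Dynamic Prompts extension expands the wildcard token, and that the standard `/sdapi/v1/txt2img` endpoint is reachable locally.
```python
import base64
import requests

# Generation settings copied from the list above.
payload = {
    "prompt": (
        "__animagine/character__, solo, full body, standing, no background, "
        "simple background, masterpiece, best quality <lora:lcm-animagine-3:1>"
    ),
    "negative_prompt": (
        "nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, "
        "missing fingers, extra digit, fewer digits, cropped, worst quality, "
        "low quality, normal quality, jpeg artifacts, signature, watermark, "
        "username, blurry, artist name"
    ),
    "width": 832,
    "height": 1216,
    "seed": 1234567,
}

r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
r.raise_for_status()

# The API returns base64-encoded images.
with open("sample.png", "wb") as f:
    f.write(base64.b64decode(r.json()["images"][0]))
```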
[grid0](https://yyy.wpx.jp/m/202401/animagine_character/grid0.webp),
[grid1](https://yyy.wpx.jp/m/202401/animagine_character/grid1.webp),
[grid2](https://yyy.wpx.jp/m/202401/animagine_character/grid2.webp),
[grid3](https://yyy.wpx.jp/m/202401/animagine_character/grid3.webp)
| Zuntan/Animagine_XL_3.0-Character | [
"license:unknown",
"region:us"
] | 2024-01-22T08:43:04+00:00 | {"license": "unknown"} | 2024-01-26T09:19:08+00:00 | [] | [] | TAGS
#license-unknown #region-us
| # Animagine XL 3.0 Character
EasySdxlWebUi による Animagine XL 3.0 の 公式 Character ワイルドカード の立ち絵データセットです。
データセットのダウンロードは こちら(2880枚、497MB)。
表情(278MB) と 画風(115MB) も用意しました。
!face
画像の類似度や Tagger の結果比較で正常動作するワイルドカードリストを用意できないかな?と思って始めてみました。
が、衣装違いなどの不正解画像でも作品名やキャラ名の影響を大きく受けるため、他のソースなしの正否分類は難しそうです。
- 各 webp 画像を Stable Diffusion web UI の 'PNG内の情報を表示' にドラッグ&ドロップすると生成情報を確認できます。
- プロンプトは '__animagine/character__, solo, full body, standing, no background, simple background, masterpiece, best quality <lora:lcm-animagine-3:1>' です。
- ネガティブプロンプト Animagine XL のデフォルトネガティブの先頭に NSFW 対策付与で 'nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name' です。
- アップスケール前の生成サイズは '832' x '1216' です。
- Seed は '1234567' です。
- 他のシードで正否が変わる可能性があります。
- 他は EasySdxlWebUi のデフォルト設定です。
grid0,
grid1,
grid2,
grid3
| [
"# Animagine XL 3.0 Character\n\nEasySdxlWebUi による Animagine XL 3.0 の 公式 Character ワイルドカード の立ち絵データセットです。\n\nデータセットのダウンロードは こちら(2880枚、497MB)。 \n表情(278MB) と 画風(115MB) も用意しました。\n\n!face\n\n画像の類似度や Tagger の結果比較で正常動作するワイルドカードリストを用意できないかな?と思って始めてみました。 \nが、衣装違いなどの不正解画像でも作品名やキャラ名の影響を大きく受けるため、他のソースなしの正否分類は難しそうです。\n\n- 各 webp 画像を Stable Diffusion web UI の 'PNG内の情報を表示' にドラッグ&ドロップすると生成情報を確認できます。\n- プロンプトは '__animagine/character__, solo, full body, standing, no background, simple background, masterpiece, best quality <lora:lcm-animagine-3:1>' です。\n- ネガティブプロンプト Animagine XL のデフォルトネガティブの先頭に NSFW 対策付与で 'nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name' です。\n- アップスケール前の生成サイズは '832' x '1216' です。\n- Seed は '1234567' です。\n\t- 他のシードで正否が変わる可能性があります。\n- 他は EasySdxlWebUi のデフォルト設定です。\n\ngrid0, \ngrid1, \ngrid2, \ngrid3"
] | [
"TAGS\n#license-unknown #region-us \n",
"# Animagine XL 3.0 Character\n\nEasySdxlWebUi による Animagine XL 3.0 の 公式 Character ワイルドカード の立ち絵データセットです。\n\nデータセットのダウンロードは こちら(2880枚、497MB)。 \n表情(278MB) と 画風(115MB) も用意しました。\n\n!face\n\n画像の類似度や Tagger の結果比較で正常動作するワイルドカードリストを用意できないかな?と思って始めてみました。 \nが、衣装違いなどの不正解画像でも作品名やキャラ名の影響を大きく受けるため、他のソースなしの正否分類は難しそうです。\n\n- 各 webp 画像を Stable Diffusion web UI の 'PNG内の情報を表示' にドラッグ&ドロップすると生成情報を確認できます。\n- プロンプトは '__animagine/character__, solo, full body, standing, no background, simple background, masterpiece, best quality <lora:lcm-animagine-3:1>' です。\n- ネガティブプロンプト Animagine XL のデフォルトネガティブの先頭に NSFW 対策付与で 'nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name' です。\n- アップスケール前の生成サイズは '832' x '1216' です。\n- Seed は '1234567' です。\n\t- 他のシードで正否が変わる可能性があります。\n- 他は EasySdxlWebUi のデフォルト設定です。\n\ngrid0, \ngrid1, \ngrid2, \ngrid3"
] |
c99d599f0a6ab9b85b065da6f9d94f9cf731679f |
xPQA is a large-scale annotated cross-lingual Product QA dataset
https://arxiv.org/abs/2305.09249
https://github.com/amazon-science/contextual-product-qa?tab=readme-ov-file#xpqa
```
@article{shen2023xpqa,
title={xPQA: Cross-Lingual Product Question Answering across 12 Languages},
author={Shen, Xiaoyu and Asai, Akari and Byrne, Bill and de Gispert, Adri{\`a}},
journal={arXiv preprint arXiv:2305.09249},
year={2023}
}
``` | jinaai/xpqa | [
"arxiv:2305.09249",
"region:eu"
] | 2024-01-22T08:51:01+00:00 | {} | 2024-01-22T13:04:24+00:00 | [
"2305.09249"
] | [] | TAGS
#arxiv-2305.09249 #region-eu
|
xPQA is a large-scale annotated cross-lingual Product QA dataset
URL
URL
| [] | [
"TAGS\n#arxiv-2305.09249 #region-eu \n"
] |
81120515b20c8b0246c6e7b517a540d96a22a871 |
- max count_word cluster_1: 1722
- min count_word cluster_1: 11
- max count_word cluster_2: 2624
- min count_word cluster_2: 21
- max count_word cluster_3: 2370
- min count_word cluster_3: 31
```Python
DatasetDict({
Cluster_1: Dataset({
features: ['Text', 'Cluster', 'Polarity', 'count_word'],
num_rows: 4797
})
Cluster_2: Dataset({
features: ['Text', 'Cluster', 'Polarity', 'count_word'],
num_rows: 4025
})
Cluster_3: Dataset({
features: ['Text', 'Cluster', 'Polarity', 'count_word'],
num_rows: 5026
})
})
```
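A small sketch of loading one of these splits with the `datasets` library and using the `count_word` column; the repo id, split names, and column names are taken from this card.
```python
from datasets import load_dataset

# Each cluster is exposed as its own split (see the DatasetDict above).
cluster_1 = load_dataset("NickyNicky/oasst2_clusters", split="Cluster_1")
print(cluster_1.column_names)  # ['Text', 'Cluster', 'Polarity', 'count_word']

# Example: keep only the shorter texts of this cluster.
short = cluster_1.filter(lambda row: row["count_word"] <= 200)
print(len(short), "of", len(cluster_1), "rows")
```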


| NickyNicky/oasst2_clusters | [
"language:en",
"language:es",
"language:ru",
"language:zh",
"language:de",
"language:fr",
"language:th",
"language:ca",
"language:it",
"language:ja",
"language:pl",
"language:eo",
"language:eu",
"language:vi",
"language:fi",
"language:hu",
"language:ar",
"language:nl",
"language:da",
"language:tr",
"language:ko",
"language:he",
"language:id",
"language:cs",
"language:bn",
"language:sv",
"region:us"
] | 2024-01-22T08:52:53+00:00 | {"language": ["en", "es", "ru", "zh", "de", "fr", "th", "ca", "it", "ja", "pl", "eo", "eu", "vi", "fi", "hu", "ar", "nl", "da", "tr", "ko", "he", "id", "cs", "bn", "sv"], "dataset_info": {"features": [{"name": "Text", "dtype": "string"}, {"name": "Cluster", "dtype": "int32"}, {"name": "Polarity", "dtype": "float64"}, {"name": "count_word", "dtype": "int64"}], "splits": [{"name": "Cluster_1", "num_bytes": 11487341, "num_examples": 4797}, {"name": "Cluster_2", "num_bytes": 8423711, "num_examples": 4025}, {"name": "Cluster_3", "num_bytes": 16002250, "num_examples": 5026}], "download_size": 18951480, "dataset_size": 35913302}, "configs": [{"config_name": "default", "data_files": [{"split": "Cluster_1", "path": "data/Cluster_1-*"}, {"split": "Cluster_2", "path": "data/Cluster_2-*"}, {"split": "Cluster_3", "path": "data/Cluster_3-*"}]}]} | 2024-01-26T13:16:49+00:00 | [] | [
"en",
"es",
"ru",
"zh",
"de",
"fr",
"th",
"ca",
"it",
"ja",
"pl",
"eo",
"eu",
"vi",
"fi",
"hu",
"ar",
"nl",
"da",
"tr",
"ko",
"he",
"id",
"cs",
"bn",
"sv"
] | TAGS
#language-English #language-Spanish #language-Russian #language-Chinese #language-German #language-French #language-Thai #language-Catalan #language-Italian #language-Japanese #language-Polish #language-Esperanto #language-Basque #language-Vietnamese #language-Finnish #language-Hungarian #language-Arabic #language-Dutch #language-Danish #language-Turkish #language-Korean #language-Hebrew #language-Indonesian #language-Czech #language-Bengali #language-Swedish #region-us
|
- max count_word cluster_1: 1722
- min count_word cluster_1: 11
- max count_word cluster_2: 2624
- min count_word cluster_2: 21
- max count_word cluster_3: 2370
- min count_word cluster_3: 31
!image/png
!image/png
| [] | [
"TAGS\n#language-English #language-Spanish #language-Russian #language-Chinese #language-German #language-French #language-Thai #language-Catalan #language-Italian #language-Japanese #language-Polish #language-Esperanto #language-Basque #language-Vietnamese #language-Finnish #language-Hungarian #language-Arabic #language-Dutch #language-Danish #language-Turkish #language-Korean #language-Hebrew #language-Indonesian #language-Czech #language-Bengali #language-Swedish #region-us \n"
] |
cce93bc1b608facec8cf883f2369968b48fffc59 |
# Dataset Card for Evaluation run of senseable/WestLake-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [senseable/WestLake-7B-v2](https://huggingface.co/senseable/WestLake-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_senseable__WestLake-7B-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T09:52:08.185697](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__WestLake-7B-v2/blob/main/results_2024-01-22T09-52-08.185697.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533732034583493,
"acc_stderr": 0.03209755561849309,
"acc_norm": 0.652581699527691,
"acc_norm_stderr": 0.03277838192989007,
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.6706202401619532,
"mc2_stderr": 0.015393271752873241
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.013329750293382318,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.01296804068686914
},
"harness|hellaswag|10": {
"acc": 0.7194781915952997,
"acc_stderr": 0.0044833603701405775,
"acc_norm": 0.8864767974507071,
"acc_norm_stderr": 0.0031658294884891794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382186,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218974,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218974
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974333,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974333
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616325,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616325
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867454,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867454
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.016619881988177015,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.016619881988177015
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.6706202401619532,
"mc2_stderr": 0.015393271752873241
},
"harness|winogrande|5": {
"acc": 0.8697711128650355,
"acc_stderr": 0.009458870979028597
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371141
}
}
```
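As a sketch, the per-task numbers above can be pulled out of the downloaded results file; in the full JSON these entries may sit under a top-level `results` key, so the snippet below handles either layout.
```python
import json

# Read the results file linked above and print one score per task.
with open("results_2024-01-22T09-52-08.185697.json", encoding="utf-8") as f:
    data = json.load(f)

scores = data.get("results", data)  # handle both the excerpt and the full file
for task, metrics in sorted(scores.items()):
    if task == "all":
        continue
    value = metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
    if value is not None:
        print(f"{task}: {value:.4f}")
```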
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_senseable__WestLake-7B-v2 | [
"region:us"
] | 2024-01-22T09:54:32+00:00 | {"pretty_name": "Evaluation run of senseable/WestLake-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [senseable/WestLake-7B-v2](https://huggingface.co/senseable/WestLake-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_senseable__WestLake-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T09:52:08.185697](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__WestLake-7B-v2/blob/main/results_2024-01-22T09-52-08.185697.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533732034583493,\n \"acc_stderr\": 0.03209755561849309,\n \"acc_norm\": 0.652581699527691,\n \"acc_norm_stderr\": 0.03277838192989007,\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.6706202401619532,\n \"mc2_stderr\": 0.015393271752873241\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.013329750293382318,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.01296804068686914\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7194781915952997,\n \"acc_stderr\": 0.0044833603701405775,\n \"acc_norm\": 0.8864767974507071,\n \"acc_norm_stderr\": 0.0031658294884891794\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382186,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382186\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218974,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218974\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n 
\"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974333,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974333\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616325,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616325\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867454,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867454\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 
0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.016619881988177015,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.016619881988177015\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.6706202401619532,\n \"mc2_stderr\": 0.015393271752873241\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8697711128650355,\n \"acc_stderr\": 0.009458870979028597\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6762699014404853,\n \"acc_stderr\": 0.012888247397371141\n }\n}\n```", "repo_url": "https://huggingface.co/senseable/WestLake-7B-v2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-08.185697.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-08.185697.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-08.185697.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-08.185697.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-08.185697.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-08.185697.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["**/details_harness|winogrande|5_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T09-52-08.185697.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T09_52_08.185697", "path": ["results_2024-01-22T09-52-08.185697.parquet"]}, {"split": "latest", "path": 
["results_2024-01-22T09-52-08.185697.parquet"]}]}]} | 2024-01-22T09:54:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of senseable/WestLake-7B-v2
Dataset automatically created during the evaluation run of model senseable/WestLake-7B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-22T09:52:08.185697 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of senseable/WestLake-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model senseable/WestLake-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:08.185697(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of senseable/WestLake-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model senseable/WestLake-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:08.185697(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c21f00b2a4f210555acac8f1cc7c0fbadbdf788c |
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1",
"harness_winogrande_5",
split="train")
```
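A minimal follow-up sketch (not part of the auto-generated card): once a split is loaded, it can be inspected directly. No particular schema is assumed here; the columns depend on what the evaluation harness produced for that task.

```python
# Inspect the split loaded above: list the columns produced by the harness
# and print the first record. Purely illustrative; no specific schema assumed.
print(data.column_names)
print(data[0])
```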
## Latest results
These are the [latest results from run 2024-01-22T09:52:31.786673](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1/blob/main/results_2024-01-22T09-52-31.786673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6062600204892812,
"acc_stderr": 0.03315436827084122,
"acc_norm": 0.6105005647485784,
"acc_norm_stderr": 0.033825828468753205,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7076437205804561,
"mc2_stderr": 0.015031924672941057
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.014131176760131172
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.004676898861978911,
"acc_norm": 0.8530173272256523,
"acc_norm_stderr": 0.003533649851728493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.039531733777491945,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.039531733777491945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.603225806451613,
"acc_stderr": 0.027831231605767948,
"acc_norm": 0.603225806451613,
"acc_norm_stderr": 0.027831231605767948
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176085,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176085
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934486,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29720670391061454,
"acc_stderr": 0.015285313353641602,
"acc_norm": 0.29720670391061454,
"acc_norm_stderr": 0.015285313353641602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906508,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906508
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455333,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.032357437893550424,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.032357437893550424
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7076437205804561,
"mc2_stderr": 0.015031924672941057
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.01180736022402539
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721717
}
}
```
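As a hedged convenience sketch (not part of the original card), the aggregated numbers above can also be retrieved programmatically through the "results" configuration described earlier; the "latest" split follows the same naming convention as the per-task configurations.

```python
from datasets import load_dataset

# Load the aggregated results of the latest run; "results" and "latest" follow
# the configuration/split conventions described in this card.
results = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1",
    "results",
    split="latest",
)
print(results[0])
```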
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1 | [
"region:us"
] | 2024-01-22T09:54:48+00:00 | {"pretty_name": "Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T09:52:31.786673](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1/blob/main/results_2024-01-22T09-52-31.786673.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6062600204892812,\n \"acc_stderr\": 0.03315436827084122,\n \"acc_norm\": 0.6105005647485784,\n \"acc_norm_stderr\": 0.033825828468753205,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7076437205804561,\n \"mc2_stderr\": 0.015031924672941057\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131172\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n \"acc_stderr\": 0.004676898861978911,\n \"acc_norm\": 0.8530173272256523,\n \"acc_norm_stderr\": 0.003533649851728493\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 
0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.603225806451613,\n \"acc_stderr\": 0.027831231605767948,\n \"acc_norm\": 0.603225806451613,\n \"acc_norm_stderr\": 0.027831231605767948\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n 
\"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176085,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176085\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.014774358319934486,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29720670391061454,\n \"acc_stderr\": 0.015285313353641602,\n \"acc_norm\": 0.29720670391061454,\n \"acc_norm_stderr\": 0.015285313353641602\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906508,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906508\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455333,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7076437205804561,\n \"mc2_stderr\": 0.015031924672941057\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \"acc_stderr\": 0.013516752972721717\n }\n}\n```", "repo_url": "https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-31.786673.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-31.786673.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-31.786673.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-31.786673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-31.786673.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["**/details_harness|winogrande|5_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T09-52-31.786673.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T09_52_31.786673", "path": ["results_2024-01-22T09-52-31.786673.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T09-52-31.786673.parquet"]}]}]} | 2024-01-22T09:55:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1
Dataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
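A minimal sketch, assuming the repository follows the leaderboard's `details_<org>__<model>` naming pattern (the exact dataset id is not stated here) and using the `harness_winogrande_5` config listed in this card's metadata:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (config) of this run.
# NOTE: the repository id below is inferred from the leaderboard naming
# convention and is an assumption, not confirmed in this card.
data = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-dpo-e1",
    "harness_winogrande_5",
    split="train",
)
print(data)
```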
## Latest results
These are the latest results from run 2024-01-22T09:52:31.786673 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:31.786673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-dpo-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:31.786673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4b77479ea2f664289eedf3b19ac06ef19a8d8ec6 |
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1",
"harness_winogrande_5",
split="train")
```
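If helpful, the available task configurations can also be listed programmatically before choosing one to load; a minimal sketch using the standard `datasets` utility:

```python
from datasets import get_dataset_config_names

# Enumerate every evaluation config (one per task) available in this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1"
)
print(len(configs), "configs available, e.g.:", configs[:5])
```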
## Latest results
These are the [latest results from run 2024-01-22T09:52:53.024904](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1/blob/main/results_2024-01-22T09-52-53.024904.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6068306303971267,
"acc_stderr": 0.033128919989688366,
"acc_norm": 0.611125593463237,
"acc_norm_stderr": 0.03379981124271015,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.7056173366389187,
"mc2_stderr": 0.015050090065464976
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449701,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6751643098984266,
"acc_stderr": 0.004673563250946101,
"acc_norm": 0.852320254929297,
"acc_norm_stderr": 0.0035405716545956313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.039531733777491945,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.039531733777491945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371151,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371151
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.01536686038639711,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.01536686038639711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.032357437893550424,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.032357437893550424
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.7056173366389187,
"mc2_stderr": 0.015050090065464976
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836675
},
"harness|gsm8k|5": {
"acc": 0.400303260045489,
"acc_stderr": 0.013495926436566438
}
}
```
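These aggregated numbers should also be loadable from the "results" configuration of this dataset, whose "latest" split points to the most recent run; a minimal sketch:

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
# The "results" config and its "latest" split follow the leaderboard's
# standard layout for details repositories.
results = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1",
    "results",
    split="latest",
)
print(results[0])
```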
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1 | [
"region:us"
] | 2024-01-22T09:55:14+00:00 | {"pretty_name": "Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T09:52:53.024904](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1/blob/main/results_2024-01-22T09-52-53.024904.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6068306303971267,\n \"acc_stderr\": 0.033128919989688366,\n \"acc_norm\": 0.611125593463237,\n \"acc_norm_stderr\": 0.03379981124271015,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.7056173366389187,\n \"mc2_stderr\": 0.015050090065464976\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449701,\n \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6751643098984266,\n \"acc_stderr\": 0.004673563250946101,\n \"acc_norm\": 0.852320254929297,\n \"acc_norm_stderr\": 0.0035405716545956313\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n 
\"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132274,\n \"acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n 
\"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 
0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371151,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371151\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.7056173366389187,\n \"mc2_stderr\": 0.015050090065464976\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836675\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.400303260045489,\n \"acc_stderr\": 0.013495926436566438\n }\n}\n```", "repo_url": "https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-53.024904.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-53.024904.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-53.024904.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-53.024904.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T09-52-53.024904.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["**/details_harness|winogrande|5_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T09-52-53.024904.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T09_52_53.024904", "path": ["results_2024-01-22T09-52-53.024904.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T09-52-53.024904.parquet"]}]}]} | 2024-01-22T09:55:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1
Dataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
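For example, with the Hugging Face `datasets` library:
```python
from datasets import load_dataset

# Load the per-sample details for one of the 63 configurations of this run.
# The "train" (and "latest") split always points to the most recent results.
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e1",
    "harness_winogrande_5",
    split="train")
```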
## Latest results
These are the latest results from run 2024-01-22T09:52:53.024904 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:53.024904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T09:52:53.024904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
252f298e5f992eddd952bf83f37902487e9d1545 |
# Dataset Card for Evaluation run of moreh/MoMo-72B-lora-1.8.7-DPO
Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-lora-1.8.7-DPO](https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T10:33:58.465501](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO/blob/main/results_2024-01-22T10-33-58.465501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.76953499319056,
"acc_stderr": 0.0279294705479517,
"acc_norm": 0.7716820258755411,
"acc_norm_stderr": 0.0284840002969871,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7470556249138,
"mc2_stderr": 0.014379615349295343
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016195,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403511
},
"harness|hellaswag|10": {
"acc": 0.6733718382792272,
"acc_stderr": 0.004680215003395925,
"acc_norm": 0.8595897231627166,
"acc_norm_stderr": 0.0034670217932838386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8452830188679246,
"acc_stderr": 0.02225707555879128,
"acc_norm": 0.8452830188679246,
"acc_norm_stderr": 0.02225707555879128
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9305555555555556,
"acc_stderr": 0.02125797482283205,
"acc_norm": 0.9305555555555556,
"acc_norm_stderr": 0.02125797482283205
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.03214737302029468,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.03214737302029468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.02655698211783873,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.02655698211783873
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6957671957671958,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.6957671957671958,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.0270459488258654,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.0270459488258654
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637296,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637296
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03041771696171748,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03041771696171748
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673957,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673957
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.011051255247815476,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.011051255247815476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176853,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0334327006286962,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0334327006286962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.028311601441438596,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.028311601441438596
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9157088122605364,
"acc_stderr": 0.009934966499513784,
"acc_norm": 0.9157088122605364,
"acc_norm_stderr": 0.009934966499513784
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7027932960893855,
"acc_stderr": 0.015285313353641597,
"acc_norm": 0.7027932960893855,
"acc_norm_stderr": 0.015285313353641597
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8488745980707395,
"acc_stderr": 0.020342749744428647,
"acc_norm": 0.8488745980707395,
"acc_norm_stderr": 0.020342749744428647
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.018105414094329676,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.018105414094329676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.02853865002887863,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.02853865002887863
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6088657105606258,
"acc_stderr": 0.01246386183998206,
"acc_norm": 0.6088657105606258,
"acc_norm_stderr": 0.01246386183998206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.02257177102549473,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.02257177102549473
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.01564306991127334,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.01564306991127334
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007643,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007643
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759419,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759419
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7470556249138,
"mc2_stderr": 0.014379615349295343
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7862016679302501,
"acc_stderr": 0.01129305469863505
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO | [
"region:us"
] | 2024-01-22T10:36:02+00:00 | {"pretty_name": "Evaluation run of moreh/MoMo-72B-lora-1.8.7-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-lora-1.8.7-DPO](https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T10:33:58.465501](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO/blob/main/results_2024-01-22T10-33-58.465501.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.76953499319056,\n \"acc_stderr\": 0.0279294705479517,\n \"acc_norm\": 0.7716820258755411,\n \"acc_norm_stderr\": 0.0284840002969871,\n \"mc1\": 0.631578947368421,\n \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7470556249138,\n \"mc2_stderr\": 0.014379615349295343\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016195,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403511\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6733718382792272,\n \"acc_stderr\": 0.004680215003395925,\n \"acc_norm\": 0.8595897231627166,\n \"acc_norm_stderr\": 0.0034670217932838386\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8452830188679246,\n \"acc_stderr\": 0.02225707555879128,\n \"acc_norm\": 0.8452830188679246,\n \"acc_norm_stderr\": 0.02225707555879128\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n \"acc_stderr\": 0.02125797482283205,\n \"acc_norm\": 0.9305555555555556,\n \"acc_norm_stderr\": 0.02125797482283205\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.03214737302029468,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.03214737302029468\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.02655698211783873,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.02655698211783873\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6957671957671958,\n \"acc_stderr\": 0.023695415009463087,\n \"acc_norm\": 0.6957671957671958,\n \"acc_norm_stderr\": 0.023695415009463087\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.0270459488258654,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.0270459488258654\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637296,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637296\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03041771696171748,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03041771696171748\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673957,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673957\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815476,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176853,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.0334327006286962,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.0334327006286962\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.028311601441438596,\n \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.028311601441438596\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9157088122605364,\n \"acc_stderr\": 0.009934966499513784,\n \"acc_norm\": 0.9157088122605364,\n \"acc_norm_stderr\": 0.009934966499513784\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7027932960893855,\n \"acc_stderr\": 0.015285313353641597,\n \"acc_norm\": 0.7027932960893855,\n \"acc_norm_stderr\": 0.015285313353641597\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8488745980707395,\n \"acc_stderr\": 0.020342749744428647,\n \"acc_norm\": 0.8488745980707395,\n \"acc_norm_stderr\": 0.020342749744428647\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.018105414094329676,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.018105414094329676\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6453900709219859,\n \"acc_stderr\": 0.02853865002887863,\n \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.02853865002887863\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6088657105606258,\n \"acc_stderr\": 0.01246386183998206,\n \"acc_norm\": 0.6088657105606258,\n \"acc_norm_stderr\": 0.01246386183998206\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549473,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549473\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.01564306991127334,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.01564306991127334\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007643,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007643\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759419,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759419\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7470556249138,\n \"mc2_stderr\": 0.014379615349295343\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7862016679302501,\n \"acc_stderr\": 0.01129305469863505\n 
}\n}\n```", "repo_url": "https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|arc:challenge|25_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|gsm8k|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hellaswag|10_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T10-33-58.465501.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T10-33-58.465501.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T10-33-58.465501.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T10-33-58.465501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T10-33-58.465501.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T10_33_58.465501", "path": ["**/details_harness|winogrande|5_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T10-33-58.465501.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T10_33_58.465501", "path": ["results_2024-01-22T10-33-58.465501.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T10-33-58.465501.parquet"]}]}]} | 2024-01-22T10:36:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of moreh/MoMo-72B-lora-1.8.7-DPO
Dataset automatically created during the evaluation run of model moreh/MoMo-72B-lora-1.8.7-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
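For instance (this mirrors the snippet stored in the dataset metadata above), the 5-shot Winogrande details can be loaded with the Hugging Face `datasets` library; any other configuration name listed in this card can be substituted:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande evaluation;
# the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO",
    "harness_winogrande_5",
    split="train",
)
```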
## Latest results
These are the latest results from run 2024-01-22T10:33:58.465501 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
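The full per-task JSON is reproduced earlier in this card. As a minimal sketch (assuming only the `results` configuration and `latest` split declared in the metadata above), the aggregated numbers can also be pulled programmatically:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.7-DPO",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for this run
```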
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of moreh/MoMo-72B-lora-1.8.7-DPO\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-lora-1.8.7-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T10:33:58.465501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of moreh/MoMo-72B-lora-1.8.7-DPO\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-lora-1.8.7-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T10:33:58.465501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
258cb9b36404a64c292a58146d2790ea31d5e340 |
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2`](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) model.
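For reference, a minimal sketch of how such paragraph embeddings can be produced; the exact chunking, filtering, and batching used to build this dataset may differ:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

def embed_page(page):
    # Split the article body into non-empty paragraphs and encode each one.
    chunks = [p.strip() for p in page["text"].split("\n") if p.strip()]
    embeddings = model.encode(chunks, batch_size=64)
    return {"chunks": chunks, "embeddings": embeddings.tolist()}
```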
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-e5-base", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos )...',
'Myšlenky Aristotelovy rozvinul ve 2. století našeho letopočtu Klaudios Ptolemaios...',
...,
],
'embeddings': [
[0.09006806463003159, -0.009814552962779999, ...],
[0.10767366737127304, ...],
...
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
```python
from elasticsearch import Elasticsearch
from elasticsearch.helpers import parallel_bulk
from tqdm import tqdm

es = Elasticsearch("http://localhost:9200")  # adjust to your cluster

def doc_generator(data, batch_size=1000):
    for batch in data.with_format("numpy").iter(batch_size):
        for i, id in enumerate(batch["id"]):
            output = {"id": id}
            output["title"] = batch["title"][i]
            output["url"] = batch["url"][i]
            output["parts"] = [
                { "chunk": chunk, "embedding": embedding }
                for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
            ]
            yield output

num_indexed, num_failed = 0, 0
progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")

for ok, info in parallel_bulk(
    es,
    index="wikipedia-search",
    actions=doc_generator(ds),
    raise_on_error=False,
):
    if ok:
        num_indexed += 1
    else:
        num_failed += 1
        print(f"ERROR {info['index']['status']}: "
              f"{info['index']['error']['type']}: {info['index']['error']['caused_by']['type']}: "
              f"{info['index']['error']['caused_by']['reason'][:250]}")
    progress.update(1)
```
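The snippet above assumes that the `wikipedia-search` index already exists with a nested mapping for the per-paragraph embeddings. A possible mapping, assuming Elasticsearch 8.x and the 384-dimensional MiniLM vectors, could be created like this:

```python
es.indices.create(
    index="wikipedia-search",
    mappings={
        "properties": {
            "title": {"type": "text"},
            "url": {"type": "keyword"},
            "parts": {
                "type": "nested",
                "properties": {
                    "chunk": {"type": "text"},
                    "embedding": {
                        # paraphrase-multilingual-MiniLM-L12-v2 produces 384-dimensional vectors
                        "type": "dense_vector",
                        "dims": 384,
                        "index": True,
                        "similarity": "cosine",
                    },
                },
            },
        }
    },
)
```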
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap

import sentence_transformers

model = sentence_transformers.SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)
# Flatten the dataset
def explode_sequence(batch):
output = { "id": [], "url": [], "title": [], "chunk": [], "embedding": [] }
for id, url, title, chunks, embeddings in zip(
batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
):
output["id"].extend([id for _ in range(len(chunks))])
output["url"].extend([url for _ in range(len(chunks))])
output["title"].extend([title for _ in range(len(chunks))])
output["chunk"].extend(chunks)
output["embedding"].extend(embeddings)
return output
ds_flat = ds.map(
explode_sequence,
batched=True,
remove_columns=ds.column_names,
num_proc=min(os.cpu_count(), 32),
desc="Flatten")
ds_flat
query = "Čím se zabývá fyzika?"
hits = sentence_transformers.util.semantic_search(
query_embeddings=model.encode(query),
corpus_embeddings=ds_flat["embedding"],
top_k=10)
for hit in hits[0]:
title = ds_flat[hit['corpus_id']]['title']
chunk = ds_flat[hit['corpus_id']]['chunk']
print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")
# [0.90] Fyzika částic ( též částicová fyzika ) je oblast fyziky, která se zabývá částicemi. V širším smyslu… [Fyzika částic]
# [0.89] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
The embeddings generation took about 15 minutes on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
| karmiq/wikipedia-embeddings-cs-minilm | [
"task_categories:text-generation",
"task_categories:fill-mask",
"size_categories:100K<n<1M",
"language:cs",
"license:cc-by-sa-3.0",
"license:gfdl",
"region:us"
] | 2024-01-22T10:40:19+00:00 | {"language": ["cs"], "license": ["cc-by-sa-3.0", "gfdl"], "size_categories": ["100K<n<1M"], "task_categories": ["text-generation", "fill-mask"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "chunks", "sequence": "string"}, {"name": "embeddings", "sequence": {"sequence": "float32"}}], "splits": [{"name": "train", "num_bytes": 3302394852, "num_examples": 534044}], "download_size": 3029969220, "dataset_size": 3302394852}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T10:46:46+00:00 | [] | [
"cs"
] | TAGS
#task_categories-text-generation #task_categories-fill-mask #size_categories-100K<n<1M #language-Czech #license-cc-by-sa-3.0 #license-gfdl #region-us
|
This dataset contains the Czech subset of the 'wikimedia/wikipedia' dataset. Each page is divided into paragraphs, stored as a list in the 'chunks' column. For every paragraph, embeddings are created using the 'sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2' model.
## Usage
Load the dataset:
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
</details>
The embeddings generation took about 15 minutes on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <URL
| [
"## Usage\n\nLoad the dataset:\n\n\n\n\n\nThe structure makes it easy to use the dataset for implementing semantic search.\n\n<details>\n<summary>Load the data in Elasticsearch</summary>\n\n\n</details>\n\n<details>\n<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>\n\n\n</details>\n\nThe embeddings generation took about 15 minutes on an NVIDIA A100 80GB GPU.",
"## License\n\nSee license of the original dataset: <URL"
] | [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #size_categories-100K<n<1M #language-Czech #license-cc-by-sa-3.0 #license-gfdl #region-us \n",
"## Usage\n\nLoad the dataset:\n\n\n\n\n\nThe structure makes it easy to use the dataset for implementing semantic search.\n\n<details>\n<summary>Load the data in Elasticsearch</summary>\n\n\n</details>\n\n<details>\n<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>\n\n\n</details>\n\nThe embeddings generation took about 15 minutes on an NVIDIA A100 80GB GPU.",
"## License\n\nSee license of the original dataset: <URL"
] |
ddbc2a7d969b92a943ad84bfb8e0ec306a9ae068 |
# CroissantChat SFT data
```
@misc{faysse2024croissantllm,
title={CroissantLLM: A Truly Bilingual French-English Language Model},
author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
year={2024},
eprint={2402.00786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| croissantllm/CroissantLLM-2201-sft | [
"arxiv:2402.00786",
"region:us"
] | 2024-01-22T10:55:28+00:00 | {"dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "markdown", "struct": [{"name": "answer", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "type", "dtype": "string"}]}, {"name": "text", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "lang", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "task", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1184454542, "num_examples": 294220}], "download_size": 566386739, "dataset_size": 1184454542}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T08:46:36+00:00 | [
"2402.00786"
] | [] | TAGS
#arxiv-2402.00786 #region-us
|
# CroissantChat SFT data
| [
"# CroissantChat SFT data"
] | [
"TAGS\n#arxiv-2402.00786 #region-us \n",
"# CroissantChat SFT data"
] |
6cd180167f0a6d0ef194d2990c12f1eb7c643ef6 | # This dataset is collected from ImageReward for the fake class and COCO for the real class | HDanh/RealFakeDB_small | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | 2024-01-22T11:02:24+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "fake", "1": "real"}}}}], "splits": [{"name": "train", "num_bytes": 10881873439.327, "num_examples": 98163}, {"name": "validation", "num_bytes": 574289333.296, "num_examples": 5168}, {"name": "test", "num_bytes": 592123012.48, "num_examples": 5440}], "download_size": 13085799986, "dataset_size": 12048285785.102999}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-23T03:19:40+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-mit #region-us
| # This dataset is collected from ImageReward for the fake class and COCO for the real class | [] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-mit #region-us \n"
] |
93ee02366b239d5e24643411bc6f502a7d39ffeb | # Dataset Card for "testpapercomments-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davanstrien/testpapercomments-ds | [
"region:us"
] | 2024-01-22T11:07:57+00:00 | {"dataset_info": {"features": [{"name": "paper_url", "dtype": "string"}, {"name": "comment", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 502672, "num_examples": 456}], "download_size": 0, "dataset_size": 502672}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T11:22:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "testpapercomments-ds"
More Information needed | [
"# Dataset Card for \"testpapercomments-ds\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"testpapercomments-ds\"\n\nMore Information needed"
] |
9354987206020720241edba279e9dfe1031a863b |
# Dataset Card for Evaluation run of TomGrc/FusionNet_34Bx2_MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_34Bx2_MoE](https://huggingface.co/TomGrc/FusionNet_34Bx2_MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T11:29:51.974520](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE/blob/main/results_2024-01-22T11-29-51.974520.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7677884016423521,
"acc_stderr": 0.028039750027124166,
"acc_norm": 0.7713984723671282,
"acc_norm_stderr": 0.028574402204719553,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422906,
"mc2": 0.7131206524056665,
"mc2_stderr": 0.014366676245195859
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.01343890918477876,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.6693885680143398,
"acc_stderr": 0.004694718918225755,
"acc_norm": 0.8621788488348935,
"acc_norm_stderr": 0.003440076775300576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474938,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474938
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8680555555555556,
"acc_stderr": 0.02830096838204443,
"acc_norm": 0.8680555555555556,
"acc_norm_stderr": 0.02830096838204443
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.02694748312149622,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.02694748312149622
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02306818884826112,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02306818884826112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9161290322580645,
"acc_stderr": 0.015769027496775664,
"acc_norm": 0.9161290322580645,
"acc_norm_stderr": 0.015769027496775664
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656177,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656177
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930893,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235083,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235083
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334879,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334879
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065522,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065522
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.901840490797546,
"acc_stderr": 0.023376180231059602,
"acc_norm": 0.901840490797546,
"acc_norm_stderr": 0.023376180231059602
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640407,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640407
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253862,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253862
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.92,
"acc_stderr": 0.027265992434429093,
"acc_norm": 0.92,
"acc_norm_stderr": 0.027265992434429093
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.010203017847688298,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.010203017847688298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135026,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7865921787709497,
"acc_stderr": 0.01370285993219609,
"acc_norm": 0.7865921787709497,
"acc_norm_stderr": 0.01370285993219609
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.020279402936174588,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.020279402936174588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.028538650028878627,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.028538650028878627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5938722294654498,
"acc_stderr": 0.012543154588412923,
"acc_norm": 0.5938722294654498,
"acc_norm_stderr": 0.012543154588412923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113018,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113018
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.015309329266969133,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.015309329266969133
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736847,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736847
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422906,
"mc2": 0.7131206524056665,
"mc2_stderr": 0.014366676245195859
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE | [
"region:us"
] | 2024-01-22T11:32:06+00:00 | {"pretty_name": "Evaluation run of TomGrc/FusionNet_34Bx2_MoE", "dataset_summary": "Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_34Bx2_MoE](https://huggingface.co/TomGrc/FusionNet_34Bx2_MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T11:29:51.974520](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE/blob/main/results_2024-01-22T11-29-51.974520.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7677884016423521,\n \"acc_stderr\": 0.028039750027124166,\n \"acc_norm\": 0.7713984723671282,\n \"acc_norm_stderr\": 0.028574402204719553,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422906,\n \"mc2\": 0.7131206524056665,\n \"mc2_stderr\": 0.014366676245195859\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.01343890918477876,\n \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6693885680143398,\n \"acc_stderr\": 0.004694718918225755,\n \"acc_norm\": 0.8621788488348935,\n \"acc_norm_stderr\": 0.003440076775300576\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474938,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474938\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n \"acc_norm_stderr\": 0.02830096838204443\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.02694748312149622,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.02694748312149622\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02306818884826112,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02306818884826112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9161290322580645,\n \"acc_stderr\": 0.015769027496775664,\n \"acc_norm\": 0.9161290322580645,\n \"acc_norm_stderr\": 0.015769027496775664\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656177,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656177\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930893,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930893\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235083,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235083\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334879,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334879\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065522,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065522\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.901840490797546,\n \"acc_stderr\": 0.023376180231059602,\n \"acc_norm\": 0.901840490797546,\n \"acc_norm_stderr\": 0.023376180231059602\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640407,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640407\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253862,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253862\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429093,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429093\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9106002554278416,\n \"acc_stderr\": 0.010203017847688298,\n \"acc_norm\": 0.9106002554278416,\n \"acc_norm_stderr\": 0.010203017847688298\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135026,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7865921787709497,\n \"acc_stderr\": 0.01370285993219609,\n \"acc_norm\": 0.7865921787709497,\n \"acc_norm_stderr\": 0.01370285993219609\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.020279402936174588,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.020279402936174588\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6453900709219859,\n \"acc_stderr\": 0.028538650028878627,\n \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.028538650028878627\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5938722294654498,\n \"acc_stderr\": 0.012543154588412923,\n \"acc_norm\": 0.5938722294654498,\n \"acc_norm_stderr\": 0.012543154588412923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113018,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113018\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.015309329266969133,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.015309329266969133\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736847,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736847\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422906,\n \"mc2\": 0.7131206524056665,\n \"mc2_stderr\": 0.014366676245195859\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \"acc_stderr\": 
0.012513215297888463\n }\n}\n```", "repo_url": "https://huggingface.co/TomGrc/FusionNet_34Bx2_MoE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|arc:challenge|25_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|gsm8k|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hellaswag|10_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-29-51.974520.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-29-51.974520.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-29-51.974520.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T11-29-51.974520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-29-51.974520.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T11_29_51.974520", "path": ["**/details_harness|winogrande|5_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T11-29-51.974520.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T11_29_51.974520", "path": ["results_2024-01-22T11-29-51.974520.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T11-29-51.974520.parquet"]}]}]} | 2024-01-22T11:32:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TomGrc/FusionNet_34Bx2_MoE
Dataset automatically created during the evaluation run of model TomGrc/FusionNet_34Bx2_MoE on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
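Below is a minimal loading sketch. The repository id `open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE` is inferred from the card title and the `details_<org>__<model>` naming pattern these evaluation datasets follow, so verify it against the actual repository before relying on it:

```python
from datasets import load_dataset

# Repository id assumed from the "details_<org>__<model>" naming pattern.
# "harness_winogrande_5" is one of the 63 task configurations listed in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_TomGrc__FusionNet_34Bx2_MoE",
    "harness_winogrande_5",
    split="train",
)
print(data)
```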
## Latest results
These are the latest results from run 2024-01-22T11:29:51.974520 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TomGrc/FusionNet_34Bx2_MoE\n\n\n\nDataset automatically created during the evaluation run of model TomGrc/FusionNet_34Bx2_MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T11:29:51.974520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TomGrc/FusionNet_34Bx2_MoE\n\n\n\nDataset automatically created during the evaluation run of model TomGrc/FusionNet_34Bx2_MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T11:29:51.974520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
78b767c96c21c7b4c3335c5b11a786a718178c69 |
# CMMMU
[**🌐 Homepage**](https://cmmmu-benchmark.github.io/) | [**🤗 Paper**](https://huggingface.co/papers/2401.11944) | [**📖 arXiv**](https://arxiv.org/pdf/2401.11944.pdf) | [**🤗 Dataset**](https://huggingface.co/datasets/m-a-p/CMMMU) | [**GitHub**](https://github.com/CMMMU-Benchmark/CMMMU)
## Introduction
CMMMU includes 12k manually collected multimodal questions from college exams, quizzes, and textbooks, covering six core disciplines: Art & Design, Business, Science, Health & Medicine, Humanities & Social Science, and Tech & Engineering, like its companion, MMMU. These questions span 30 subjects and comprise 39 highly heterogeneous image types, such as charts, diagrams, maps, tables, music sheets, and chemical structures.

## 🏆 Mini-Leaderboard
| Model | Val (900) | Test (11K) |
|--------------------------------|:---------:|:------------:|
| GPT-4V(ision) (Playground) | **42.5** | **43.7** |
| Qwen-VL-PLUS* | 39.5 | 36.8 |
| Yi-VL-34B | 36.2 | 36.5 |
| Yi-VL-6B | 35.8 | 35.0 |
| InternVL-Chat-V1.1* | 34.7 | 34.0 |
| Qwen-VL-7B-Chat | 30.7 | 31.3 |
| SPHINX-MoE* | 29.3 | 29.5 |
| InternVL-Chat-ViT-6B-Vicuna-7B | 26.4 | 26.7 |
| InternVL-Chat-ViT-6B-Vicuna-13B| 27.4 | 26.1 |
| CogAgent-Chat | 24.6 | 23.6 |
| Emu2-Chat | 23.8 | 24.5 |
| Chinese-LLaVA | 25.5 | 23.4 |
| VisCPM | 25.2 | 22.7 |
| mPLUG-OWL2 | 20.8 | 22.2 |
| Frequent Choice | 24.1 | 26.0 |
| Random Choice | 21.6 | 21.6 |
*: results provided by the authors.
## Disclaimers
The guidelines for the annotators emphasized strict compliance with copyright and licensing rules from the initial data source, specifically avoiding materials from websites that forbid copying and redistribution.
Should you encounter any data samples potentially breaching the copyright or licensing regulations of any site, we encourage you to [contact](#contact) us. Upon verification, such samples will be promptly removed.
## Contact
- Ge Zhang: [email protected]
- Wenhao Huang: [email protected]
- Xinrun Du: [email protected]
- Bei Chen: [email protected]
- Wenhu Chen: [email protected]
- Jie Fu: [email protected]
## Citation
**BibTeX:**
```bibtex
@article{zhang2024cmmmu,
title={CMMMU: A Chinese Massive Multi-discipline Multimodal Understanding Benchmark},
author={Ge, Zhang and Xinrun, Du and Bei, Chen and Yiming, Liang and Tongxu, Luo and Tianyu, Zheng and Kang, Zhu and Yuyang, Cheng and Chunpu, Xu and Shuyue, Guo and Haoran, Zhang and Xingwei, Qu and Junjie, Wang and Ruibin, Yuan and Yizhi, Li and Zekun, Wang and Yudong, Liu and Yu-Hsuan, Tsai and Fengji, Zhang and Chenghua, Lin and Wenhao, Huang and Wenhu, Chen and Jie, Fu},
  journal={arXiv preprint arXiv:2401.11944},
year={2024},
}
```
| m-a-p/CMMMU | [
"arxiv:2401.11944",
"region:us"
] | 2024-01-22T11:37:59+00:00 | {"dataset_info": [{"config_name": "art_and_design", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": "answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 3113808.0, "num_examples": 11}, {"name": "val", "num_bytes": 25493074.0, "num_examples": 88}, {"name": "test", "num_bytes": 311985416.171, "num_examples": 1091}], "download_size": 343578142, "dataset_size": 340592298.171}, {"config_name": "business", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": "answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 554457.0, "num_examples": 16}, {"name": "val", "num_bytes": 6152883.0, "num_examples": 126}, {"name": "test", "num_bytes": 60103968.654, "num_examples": 1538}], "download_size": 74809661, "dataset_size": 66811308.654}, {"config_name": "health_and_medicine", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": 
"answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 2837834.0, "num_examples": 18}, {"name": "val", "num_bytes": 23957247.0, "num_examples": 153}, {"name": "test", "num_bytes": 204211130.315, "num_examples": 1865}], "download_size": 293138089, "dataset_size": 231006211.315}, {"config_name": "humanities_and_social_sciences", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": "answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 1406107.0, "num_examples": 11}, {"name": "val", "num_bytes": 10657772.0, "num_examples": 85}, {"name": "test", "num_bytes": 157998857.894, "num_examples": 1038}], "download_size": 160096107, "dataset_size": 170062736.894}, {"config_name": "science", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": "answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 3750526.0, "num_examples": 
25}, {"name": "val", "num_bytes": 14762058.0, "num_examples": 204}, {"name": "test", "num_bytes": 268839654.208, "num_examples": 2494}], "download_size": 214377468, "dataset_size": 287352238.208}, {"config_name": "technology_and_engineering", "features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "source_type", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "option1", "dtype": "string"}, {"name": "option2", "dtype": "string"}, {"name": "option3", "dtype": "string"}, {"name": "option4", "dtype": "string"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "image_3", "dtype": "image"}, {"name": "image_4", "dtype": "image"}, {"name": "image_5", "dtype": "image"}, {"name": "answer", "dtype": "string"}, {"name": "analysis", "dtype": "string"}, {"name": "distribution", "dtype": "string"}, {"name": "difficulty_level", "dtype": "string"}, {"name": "subcategory", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "subfield", "dtype": "string"}, {"name": "img_type", "dtype": "string"}, {"name": "image_1_filename", "dtype": "string"}, {"name": "image_2_filename", "dtype": "string"}, {"name": "image_3_filename", "dtype": "string"}, {"name": "image_4_filename", "dtype": "string"}, {"name": "image_5_filename", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 1518231.0, "num_examples": 31}, {"name": "val", "num_bytes": 14794870.0, "num_examples": 244}, {"name": "test", "num_bytes": 134814024.93, "num_examples": 2974}], "download_size": 213411677, "dataset_size": 151127125.93}], "configs": [{"config_name": "art_and_design", "data_files": [{"split": "dev", "path": "art_and_design/dev-*"}, {"split": "val", "path": "art_and_design/val-*"}, {"split": "test", "path": "art_and_design/test-*"}]}, {"config_name": "business", "data_files": [{"split": "dev", "path": "business/dev-*"}, {"split": "val", "path": "business/val-*"}, {"split": "test", "path": "business/test-*"}]}, {"config_name": "health_and_medicine", "data_files": [{"split": "dev", "path": "health_and_medicine/dev-*"}, {"split": "val", "path": "health_and_medicine/val-*"}, {"split": "test", "path": "health_and_medicine/test-*"}]}, {"config_name": "humanities_and_social_sciences", "data_files": [{"split": "dev", "path": "humanities_and_social_sciences/dev-*"}, {"split": "val", "path": "humanities_and_social_sciences/val-*"}, {"split": "test", "path": "humanities_and_social_sciences/test-*"}]}, {"config_name": "science", "data_files": [{"split": "dev", "path": "science/dev-*"}, {"split": "val", "path": "science/val-*"}, {"split": "test", "path": "science/test-*"}]}, {"config_name": "technology_and_engineering", "data_files": [{"split": "dev", "path": "technology_and_engineering/dev-*"}, {"split": "val", "path": "technology_and_engineering/val-*"}, {"split": "test", "path": "technology_and_engineering/test-*"}]}]} | 2024-01-29T14:08:55+00:00 | [
"2401.11944"
] | [] | TAGS
#arxiv-2401.11944 #region-us
| CMMMU
=====
Homepage | Paper | arXiv | Dataset | GitHub
Introduction
------------
CMMMU includes 12k manually collected multimodal questions from college exams, quizzes, and textbooks, covering six core disciplines: Art & Design, Business, Science, Health & Medicine, Humanities & Social Science, and Tech & Engineering, like its companion, MMMU. These questions span 30 subjects and comprise 39 highly heterogeneous image types, such as charts, diagrams, maps, tables, music sheets, and chemical structures.
!Alt text
Mini-Leaderboard
----------------
\*: results provided by the authors.
Disclaimers
-----------
The guidelines for the annotators emphasized strict compliance with copyright and licensing rules from the initial data source, specifically avoiding materials from websites that forbid copying and redistribution.
Should you encounter any data samples potentially breaching the copyright or licensing regulations of any site, we encourage you to contact us. Upon verification, such samples will be promptly removed.
Contact
-------
* Ge Zhang: zhangge@URL
* Wenhao Huang: huangwenhao@URL
* Xinrun Du: duxinrun@URL
* Bei Chen: chenbei@URL
* Wenhu Chen: wenhuchen@URL
* Jie Fu: jiefu@URL
BibTeX:
| [] | [
"TAGS\n#arxiv-2401.11944 #region-us \n"
] |
0b0d93ebc84bb2d19cb97e4e0bf032d8d00c66b8 |
# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0",
"harness_winogrande_5",
split="train")
```
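The aggregated scores live in the `results` configuration mentioned above. A short, illustrative way to read them — the `latest` split name follows the metadata convention these evaluation datasets use, so double-check it if the repository layout differs:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before relying on specific fields
```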
## Latest results
These are the [latest results from run 2024-01-22T11:42:12.340351](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0/blob/main/results_2024-01-22T11-42-12.340351.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6540936000655238,
"acc_stderr": 0.03205816083533057,
"acc_norm": 0.6522358528859891,
"acc_norm_stderr": 0.03276420137095382,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6815205984888131,
"mc2_stderr": 0.01532113622068651
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725227,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7338179645488947,
"acc_stderr": 0.00441057343183763,
"acc_norm": 0.8878709420434177,
"acc_norm_stderr": 0.0031488032469642897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6815205984888131,
"mc2_stderr": 0.01532113622068651
},
"harness|winogrande|5": {
"acc": 0.909234411996843,
"acc_stderr": 0.008073868876783524
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.01274030571737627
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0 | [
"region:us"
] | 2024-01-22T11:44:28+00:00 | {"pretty_name": "Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T11:42:12.340351](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0/blob/main/results_2024-01-22T11-42-12.340351.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6540936000655238,\n \"acc_stderr\": 0.03205816083533057,\n \"acc_norm\": 0.6522358528859891,\n \"acc_norm_stderr\": 0.03276420137095382,\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6815205984888131,\n \"mc2_stderr\": 0.01532113622068651\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725227,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7338179645488947,\n \"acc_stderr\": 0.00441057343183763,\n \"acc_norm\": 0.8878709420434177,\n \"acc_norm_stderr\": 0.0031488032469642897\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6815205984888131,\n \"mc2_stderr\": 0.01532113622068651\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.909234411996843,\n \"acc_stderr\": 0.008073868876783524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n 
\"acc_stderr\": 0.01274030571737627\n }\n}\n```", "repo_url": "https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|arc:challenge|25_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|gsm8k|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hellaswag|10_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-42-12.340351.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-42-12.340351.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-42-12.340351.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T11-42-12.340351.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-42-12.340351.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T11_42_12.340351", "path": ["**/details_harness|winogrande|5_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T11-42-12.340351.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T11_42_12.340351", "path": ["results_2024-01-22T11-42-12.340351.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T11-42-12.340351.parquet"]}]}]} | 2024-01-22T11:44:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0
Dataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
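```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v4.0",
    "harness_winogrande_5",
    split="train",
)
```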
## Latest results
These are the latest results from run 2024-01-22T11:42:12.340351 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0\n\n\n\nDataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T11:42:12.340351(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0\n\n\n\nDataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v4.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T11:42:12.340351(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
87bae63b1be7e11d60e6c892fe69b64b79380808 |
# ViP-LLaVA Instruct Dataset Card
## Dataset details
**Dataset type:**
ViP-LLaVA Instruct is composed of a mixture of LLaVA-1.5 instruction data and the region-level visual prompting data.
It is constructed for visual instruction tuning and for building large multimodal models towards GPT-4 level regional understanding capability.
Specifically, we use 1.2M data for stage 2 finetuning, and use 26K data for the optional stage 3 finetuning.
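Assuming the instruction data is stored as JSON files in this repository (as with the LLaVA-1.5 instruct data), a minimal sketch of downloading and inspecting one of the files with `huggingface_hub` follows; the file name below is a placeholder and should be replaced with an actual file listed in the repo:

```python
import json

from huggingface_hub import hf_hub_download

# NOTE: "vip-llava_stage2_mix.json" is a hypothetical file name used for
# illustration only; substitute the actual stage-2 or stage-3 JSON file.
path = hf_hub_download(
    repo_id="mucai/ViP-LLaVA-Instruct",
    filename="vip-llava_stage2_mix.json",
    repo_type="dataset",
)

with open(path) as f:
    records = json.load(f)

print(len(records), "instruction records")
print(records[0].keys())
```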
**Dataset date:**
ViP-LLaVA Instruct was collected in November 2023, by using a mixture of academic dataset and GPT-4/GPT-4V instructed dataset.
**Paper or resources for more information:**
https://vip-llava.github.io/
**License:**
Apache-2.0; and it should abide by the policy of OpenAI: https://openai.com/policies/terms-of-use
**Where to send questions or comments about the model:**
https://github.com/mu-cai/ViP-LLaVA/issues
## Intended use
**Primary intended uses:**
The primary use of ViP-LLaVA is research on large multimodal models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. | mucai/ViP-LLaVA-Instruct | [
"task_categories:visual-question-answering",
"task_categories:question-answering",
"size_categories:1M<n<10M",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-22T11:51:53+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M", "10K<n<100K"], "task_categories": ["visual-question-answering", "question-answering"], "pretty_name": "ViP-LLaVA Visual Instruct"} | 2024-01-23T10:00:31+00:00 | [] | [
"en"
] | TAGS
#task_categories-visual-question-answering #task_categories-question-answering #size_categories-1M<n<10M #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
# ViP-LLaVA Instruct Dataset Card
## Dataset details
Dataset type:
ViP-LLaVA Instruct is composed of a mixture of LLaVA-1.5 instruction data and the region-level visual prompting data.
It is constructed for visual instruction tuning and for building large multimodal models towards GPT-4 level regional understanding capability.
Specifically, we use 1.2M data for stage 2 finetuning, and use 26K data for the optional stage 3 finetuning.
Dataset date:
ViP-LLaVA Instruct was collected in November 2023, by using a mixture of academic dataset and GPT-4/GPT-4V instructed dataset.
Paper or resources for more information:
URL
License:
Apache-2.0; and it should abide by the policy of OpenAI: URL
Where to send questions or comments about the model:
URL
## Intended use
Primary intended uses:
The primary use of ViP-LLaVA is research on large multimodal models and chatbots.
Primary intended users:
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. | [
"# ViP-LLaVA Instruct Dataset Card",
"## Dataset details\n\nDataset type:\nViP-LLaVA Instruct is composed of a mixture of LLaVA-1.5 instruction data and the region-level visual prompting data. \nIt is constructed for visual instruction tuning and for building large multimodal towards GPT-4 level regional understanding capability.\n\nSpecifically, we use 1.2M data for stage 2 finetuning, and use 26K data for the optional stage 3 finetuning. \n\nDataset date:\nViP-LLaVA Instruct was collected in November 2023, by using a mixture of academic dataset and GPT-4/GPT-4V instructed dataset.\n\nPaper or resources for more information:\nURL\n\nLicense:\nApache-2.0; and it should abide by the policy of OpenAI: URL\n\nWhere to send questions or comments about the model:\nURL",
"## Intended use\nPrimary intended uses:\nThe primary use of ViP-LLaVA is research on large multimodal models and chatbots.\n\nPrimary intended users:\nThe primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence."
] | [
"TAGS\n#task_categories-visual-question-answering #task_categories-question-answering #size_categories-1M<n<10M #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"# ViP-LLaVA Instruct Dataset Card",
"## Dataset details\n\nDataset type:\nViP-LLaVA Instruct is composed of a mixture of LLaVA-1.5 instruction data and the region-level visual prompting data. \nIt is constructed for visual instruction tuning and for building large multimodal towards GPT-4 level regional understanding capability.\n\nSpecifically, we use 1.2M data for stage 2 finetuning, and use 26K data for the optional stage 3 finetuning. \n\nDataset date:\nViP-LLaVA Instruct was collected in November 2023, by using a mixture of academic dataset and GPT-4/GPT-4V instructed dataset.\n\nPaper or resources for more information:\nURL\n\nLicense:\nApache-2.0; and it should abide by the policy of OpenAI: URL\n\nWhere to send questions or comments about the model:\nURL",
"## Intended use\nPrimary intended uses:\nThe primary use of ViP-LLaVA is research on large multimodal models and chatbots.\n\nPrimary intended users:\nThe primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence."
] |
159610c7d3a07ec7dffafe54f4faa52e8ab89367 |
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`intfloat/multilingual-e5-base`](https://huggingface.co/intfloat/multilingual-e5-base) model.
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-e5-base", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos )...',
'Myšlenky Aristotelovy rozvinul ve 2. století našeho letopočtu Klaudios Ptolemaios...',
...,
],
'embeddings': [
[0.09006806463003159, -0.009814552962779999, ...],
[0.10767366737127304, ...],
...
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
```python
from elasticsearch import Elasticsearch
from elasticsearch.helpers import parallel_bulk
from tqdm import tqdm

# Assumes a running Elasticsearch cluster; adjust the URL and authentication.
es = Elasticsearch("http://localhost:9200")

def doc_generator(data, batch_size=1000):
for batch in data.with_format("numpy").iter(batch_size):
for i, id in enumerate(batch["id"]):
output = {"id": id}
output["title"] = batch["title"][i]
output["url"] = batch["url"][i]
output["parts"] = [
{ "chunk": chunk, "embedding": embedding }
for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
]
yield output
num_indexed, num_failed = 0, 0
progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")
for ok, info in parallel_bulk(
es,
index="wikipedia-search",
actions=doc_generator(ds),
raise_on_error=False,
):
if not ok:
print(f"ERROR {info['index']['status']}: "
f"{info['index']['error']['type']}: {info['index']['error']['caused_by']['type']}: "
f"{info['index']['error']['caused_by']['reason'][:250]}")
progress.update(1)
```
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap

import sentence_transformers
model = sentence_transformers.SentenceTransformer("intfloat/multilingual-e5-base")
ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)
# Flatten the dataset
def explode_sequence(batch):
output = { "id": [], "url": [], "title": [], "chunk": [], "embedding": [] }
for id, url, title, chunks, embeddings in zip(
batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
):
output["id"].extend([id for _ in range(len(chunks))])
output["url"].extend([url for _ in range(len(chunks))])
output["title"].extend([title for _ in range(len(chunks))])
output["chunk"].extend(chunks)
output["embedding"].extend(embeddings)
return output
ds_flat = ds.map(
explode_sequence,
batched=True,
remove_columns=ds.column_names,
num_proc=min(os.cpu_count(), 32),
desc="Flatten")
ds_flat
query = "Čím se zabývá fyzika?"
hits = sentence_transformers.util.semantic_search(
query_embeddings=model.encode(query),
corpus_embeddings=ds_flat["embedding"],
top_k=10)
for hit in hits[0]:
title = ds_flat[hit['corpus_id']]['title']
chunk = ds_flat[hit['corpus_id']]['chunk']
print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")
# [0.90] Fyzika částic ( též částicová fyzika ) je oblast fyziky, která se zabývá částicemi. V širším smyslu… [Fyzika částic]
# [0.89] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
The embeddings generation took about 2 hours on an NVIDIA A100 80GB GPU.
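For reference, here is a minimal sketch of how the per-paragraph embeddings could be reproduced with `sentence-transformers`. The batch sizes, the `passage: ` prefix commonly recommended for E5 models, and embedding normalization are assumptions, not necessarily the exact settings used to build this dataset:

```python
import sentence_transformers

model = sentence_transformers.SentenceTransformer("intfloat/multilingual-e5-base")

def embed_chunks(batch):
    # E5 models are typically used with a "passage: " prefix for documents;
    # whether the prefix and normalization were used for this dataset is an assumption.
    embeddings = []
    for chunks in batch["chunks"]:
        vectors = model.encode(
            ["passage: " + chunk for chunk in chunks],
            batch_size=32,
            normalize_embeddings=True,
        )
        embeddings.append(vectors.tolist())
    return {"embeddings": embeddings}

# Recomputes the "embeddings" column from the "chunks" column
ds_with_embeddings = ds.map(embed_chunks, batched=True, batch_size=8)
```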
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
| karmiq/wikipedia-embeddings-cs-e5-base | [
"task_categories:text-generation",
"task_categories:fill-mask",
"size_categories:100K<n<1M",
"language:cs",
"license:cc-by-sa-3.0",
"license:gfdl",
"region:us"
] | 2024-01-22T11:57:02+00:00 | {"language": ["cs"], "license": ["cc-by-sa-3.0", "gfdl"], "size_categories": ["100K<n<1M"], "task_categories": ["text-generation", "fill-mask"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "chunks", "sequence": "string"}, {"name": "embeddings", "sequence": {"sequence": "float32"}}], "splits": [{"name": "train", "num_bytes": 5021489124, "num_examples": 534044}], "download_size": 4750515911, "dataset_size": 5021489124}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T12:07:52+00:00 | [] | [
"cs"
] | TAGS
#task_categories-text-generation #task_categories-fill-mask #size_categories-100K<n<1M #language-Czech #license-cc-by-sa-3.0 #license-gfdl #region-us
|
This dataset contains the Czech subset of the 'wikimedia/wikipedia' dataset. Each page is divided into paragraphs, stored as a list in the 'chunks' column. For every paragraph, embeddings are created using the 'intfloat/multilingual-e5-base' model.
## Usage
Load the dataset:
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
</details>
The embeddings generation took about 2 hours on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <URL
| [
"## Usage\n\nLoad the dataset:\n\n\n\n\n\nThe structure makes it easy to use the dataset for implementing semantic search.\n\n<details>\n<summary>Load the data in Elasticsearch</summary>\n\n\n</details>\n\n<details>\n<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>\n\n\n</details>\n\nThe embeddings generation took about 2 hours on an NVIDIA A100 80GB GPU.",
"## License\n\nSee license of the original dataset: <URL"
] | [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #size_categories-100K<n<1M #language-Czech #license-cc-by-sa-3.0 #license-gfdl #region-us \n",
"## Usage\n\nLoad the dataset:\n\n\n\n\n\nThe structure makes it easy to use the dataset for implementing semantic search.\n\n<details>\n<summary>Load the data in Elasticsearch</summary>\n\n\n</details>\n\n<details>\n<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>\n\n\n</details>\n\nThe embeddings generation took about 2 hours on an NVIDIA A100 80GB GPU.",
"## License\n\nSee license of the original dataset: <URL"
] |
d6413ca115423ff04ea671fcd7bcaff2f219919a |
This dataset is used in the paper ["Replicable Benchmarking of Neural Machine Translation (NMT) on Low-Resource Local Languages in Indonesia"](https://arxiv.org/abs/2311.00998).
This repository contains two types of data (see the loading sketch below):
1. Monolingual (*.txt)
2. Bilingual (*.tsv)
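A minimal sketch of loading both kinds of files with `huggingface_hub` and `pandas` follows; the file names below are hypothetical placeholders, so check the repository file listing for the actual names:

```python
import pandas as pd
from huggingface_hub import hf_hub_download

repo = "Exqrch/IndonesianNMT"

# Bilingual data: tab-separated sentence pairs.
# "id-jv.tsv" is a placeholder name; adjust header/column handling to the actual file layout.
tsv_path = hf_hub_download(repo_id=repo, filename="id-jv.tsv", repo_type="dataset")
bitext = pd.read_csv(tsv_path, sep="\t")
print(bitext.head())

# Monolingual data: one sentence per line. "id.txt" is likewise a placeholder name.
txt_path = hf_hub_download(repo_id=repo, filename="id.txt", repo_type="dataset")
with open(txt_path, encoding="utf-8") as f:
    sentences = [line.strip() for line in f]
print(sentences[:3])
```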
If used, please cite
```
@misc{susanto2023replicable,
title={Replicable Benchmarking of Neural Machine Translation (NMT) on Low-Resource Local Languages in Indonesia},
author={Lucky Susanto and Ryandito Diandaru and Adila Krisnadhi and Ayu Purwarianti and Derry Wijaya},
year={2023},
eprint={2311.00998},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
This dataset is licensed under the [Creative Commons Attribution 4.0 International License (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).
You are free to:
- Share: Copy and redistribute the material in any medium or format.
- Adapt: Remix, transform, and build upon the material for any purpose, even commercially.
Under the following terms:
- Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
See the [full text of the license](https://creativecommons.org/licenses/by/4.0/) for more details.
| Exqrch/IndonesianNMT | [
"task_categories:translation",
"language:id",
"language:jv",
"language:su",
"language:ban",
"language:min",
"arxiv:2311.00998",
"region:us"
] | 2024-01-22T13:35:57+00:00 | {"language": ["id", "jv", "su", "ban", "min"], "task_categories": ["translation"]} | 2024-01-22T14:05:37+00:00 | [
"2311.00998"
] | [
"id",
"jv",
"su",
"ban",
"min"
] | TAGS
#task_categories-translation #language-Indonesian #language-Javanese #language-Sundanese #language-Balinese #language-Minangkabau #arxiv-2311.00998 #region-us
|
This dataset is used in the paper "Replicable Benchmarking of Neural Machine Translation (NMT) on Low-Resource Local Languages in Indonesia".
This repository contains two types of data:
1. Monolingual (*.txt)
2. Bilingual (*.tsv)
If used, please cite
## License
This dataset is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0).
You are free to:
- Share: Copy and redistribute the material in any medium or format.
- Adapt: Remix, transform, and build upon the material for any purpose, even commercially.
Under the following terms:
- Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
See the full text of the license for more details.
| [
"## License\n\nThis dataset is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0).\n\nYou are free to:\n- Share: Copy and redistribute the material in any medium or format.\n- Adapt: Remix, transform, and build upon the material for any purpose, even commercially.\n\nUnder the following terms:\n- Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.\n\nSee the full text of the license for more details."
] | [
"TAGS\n#task_categories-translation #language-Indonesian #language-Javanese #language-Sundanese #language-Balinese #language-Minangkabau #arxiv-2311.00998 #region-us \n",
"## License\n\nThis dataset is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0).\n\nYou are free to:\n- Share: Copy and redistribute the material in any medium or format.\n- Adapt: Remix, transform, and build upon the material for any purpose, even commercially.\n\nUnder the following terms:\n- Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.\n\nSee the full text of the license for more details."
] |
0b83e62eab384c21e241c0953a70d0193c23cc88 |
# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_tuned_gpt2_large
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_spin_tuned_gpt2_large](https://huggingface.co/LordNoah/Alpaca_spin_tuned_gpt2_large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large",
"harness_winogrande_5",
split="train")
```
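To see which of the 63 configurations are available before picking one, you can list them (a small sketch using the `datasets` library):

```python
from datasets import get_dataset_config_names

# Lists the per-task configurations plus the aggregated "results" configuration
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large"
)
print(len(configs), configs[:5])
```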
## Latest results
These are the [latest results from run 2024-01-22T13:42:11.763277](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large/blob/main/results_2024-01-22T13-42-11.763277.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27225604584142943,
"acc_stderr": 0.03141590282455585,
"acc_norm": 0.27399653003630087,
"acc_norm_stderr": 0.03221603447267582,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757449,
"mc2": 0.39429285512218326,
"mc2_stderr": 0.01421822540176183
},
"harness|arc:challenge|25": {
"acc": 0.2568259385665529,
"acc_stderr": 0.0127669237941168,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601341
},
"harness|hellaswag|10": {
"acc": 0.36297550288787095,
"acc_stderr": 0.004798751281560822,
"acc_norm": 0.45120493925512845,
"acc_norm_stderr": 0.004965963647210318
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3433962264150943,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.3433962264150943,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518753,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518753
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261107,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261107
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.02489246917246284,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.02489246917246284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.31088082901554404,
"acc_stderr": 0.033403619062765885,
"acc_norm": 0.31088082901554404,
"acc_norm_stderr": 0.033403619062765885
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3339449541284404,
"acc_stderr": 0.020220554196736403,
"acc_norm": 0.3339449541284404,
"acc_norm_stderr": 0.020220554196736403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510927,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510927
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1031390134529148,
"acc_stderr": 0.020412564289839272,
"acc_norm": 0.1031390134529148,
"acc_norm_stderr": 0.020412564289839272
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969174,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969174
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755807,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755807
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3247588424437299,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.3247588424437299,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113893,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113893
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.26838235294117646,
"acc_stderr": 0.02691748122437722,
"acc_norm": 0.26838235294117646,
"acc_norm_stderr": 0.02691748122437722
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146623,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146623
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757449,
"mc2": 0.39429285512218326,
"mc2_stderr": 0.01421822540176183
},
"harness|winogrande|5": {
"acc": 0.5461720599842147,
"acc_stderr": 0.013992441563707063
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.00213867030146048
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large | [
"region:us"
] | 2024-01-22T13:43:30+00:00 | {"pretty_name": "Evaluation run of LordNoah/Alpaca_spin_tuned_gpt2_large", "dataset_summary": "Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_spin_tuned_gpt2_large](https://huggingface.co/LordNoah/Alpaca_spin_tuned_gpt2_large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T13:42:11.763277](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large/blob/main/results_2024-01-22T13-42-11.763277.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27225604584142943,\n \"acc_stderr\": 0.03141590282455585,\n \"acc_norm\": 0.27399653003630087,\n \"acc_norm_stderr\": 0.03221603447267582,\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.014480038578757449,\n \"mc2\": 0.39429285512218326,\n \"mc2_stderr\": 0.01421822540176183\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2568259385665529,\n \"acc_stderr\": 0.0127669237941168,\n \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601341\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36297550288787095,\n \"acc_stderr\": 0.004798751281560822,\n \"acc_norm\": 0.45120493925512845,\n \"acc_norm_stderr\": 0.004965963647210318\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119667,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119667\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3433962264150943,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.3433962264150943,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826368,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826368\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261107,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261107\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.02489246917246284,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.02489246917246284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.033403619062765885,\n \"acc_norm\": 
0.31088082901554404,\n \"acc_norm_stderr\": 0.033403619062765885\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3339449541284404,\n \"acc_stderr\": 0.020220554196736403,\n \"acc_norm\": 0.3339449541284404,\n \"acc_norm_stderr\": 0.020220554196736403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510927,\n \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510927\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1031390134529148,\n \"acc_stderr\": 0.020412564289839272,\n \"acc_norm\": 0.1031390134529148,\n \"acc_norm_stderr\": 0.020412564289839272\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755807,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755807\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677,\n \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3247588424437299,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.3247588424437299,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543346,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543346\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n \"acc_stderr\": 0.010976425013113893,\n \"acc_norm\": 0.24445893089960888,\n \"acc_norm_stderr\": 0.010976425013113893\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.26838235294117646,\n \"acc_stderr\": 0.02691748122437722,\n \"acc_norm\": 0.26838235294117646,\n \"acc_norm_stderr\": 0.02691748122437722\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146623,\n \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146623\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683903,\n \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683903\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.014480038578757449,\n \"mc2\": 0.39429285512218326,\n \"mc2_stderr\": 0.01421822540176183\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5461720599842147,\n \"acc_stderr\": 0.013992441563707063\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n 
\"acc_stderr\": 0.00213867030146048\n }\n}\n```", "repo_url": "https://huggingface.co/LordNoah/Alpaca_spin_tuned_gpt2_large", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-42-11.763277.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-42-11.763277.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-42-11.763277.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-42-11.763277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-42-11.763277.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T13_42_11.763277", "path": ["**/details_harness|winogrande|5_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T13-42-11.763277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T13_42_11.763277", "path": ["results_2024-01-22T13-42-11.763277.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T13-42-11.763277.parquet"]}]}]} | 2024-01-22T13:43:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_tuned_gpt2_large
Dataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_tuned_gpt2_large on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
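A minimal sketch is given below, assuming the details repository follows the usual "open-llm-leaderboard/details_<org>__<model>" naming scheme; the config name and split are illustrative and may need to be swapped for one of the configs and splits listed in this card's metadata (for example the "latest" split).

```python
from datasets import load_dataset

# Repo id inferred from the model name ("details_<org>__<model>"); the config and
# split names are illustrative. Depending on the card revision, the split may be
# named "latest" or carry the run timestamp instead of "train".
data = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_spin_tuned_gpt2_large",
    "harness_winogrande_5",
    split="train",
)
```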
## Latest results
These are the latest results from run 2024-01-22T13:42:11.763277 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_tuned_gpt2_large\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_tuned_gpt2_large on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:42:11.763277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_tuned_gpt2_large\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_tuned_gpt2_large on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:42:11.763277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
67f5a935f88be7634018ec3fa0924636be847a27 |
# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_tuned_gpt2_large
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_refine_tuned_gpt2_large](https://huggingface.co/LordNoah/Alpaca_refine_tuned_gpt2_large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large",
"harness_winogrande_5",
split="train")
```
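As a short follow-up sketch, the aggregated metrics can be loaded the same way. The "results" config and the "latest" split names are taken from this dataset's config list; the exact column layout of the results parquet is an assumption here.

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run. The "results" config and the
# "latest" split appear in this dataset's config list; treating each row as the
# aggregated scores of one run is an assumption about the parquet layout.
results = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large",
    "results",
    split="latest",
)
print(results[0])
```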
## Latest results
These are the [latest results from run 2024-01-22T13:48:41.144585](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large/blob/main/results_2024-01-22T13-48-41.144585.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27073502428002855,
"acc_stderr": 0.031357406847036265,
"acc_norm": 0.27231339886188605,
"acc_norm_stderr": 0.032151283377254807,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.01439190265242768,
"mc2": 0.3790988407554485,
"mc2_stderr": 0.01414566169158044
},
"harness|arc:challenge|25": {
"acc": 0.25426621160409557,
"acc_stderr": 0.01272499994515774,
"acc_norm": 0.27559726962457337,
"acc_norm_stderr": 0.013057169655761838
},
"harness|hellaswag|10": {
"acc": 0.36367257518422624,
"acc_stderr": 0.004800728138792372,
"acc_norm": 0.4509061939852619,
"acc_norm_stderr": 0.004965670398127349
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764822,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510927,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510927
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755807,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755807
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564404,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20178799489144317,
"acc_stderr": 0.014351702181636873,
"acc_norm": 0.20178799489144317,
"acc_norm_stderr": 0.014351702181636873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803838,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803838
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33762057877813506,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.33762057877813506,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02313237623454333,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02313237623454333
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.010875700787694221,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.010875700787694221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.026799562024887674,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.026799562024887674
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779603,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779603
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879815,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879815
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017197,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017197
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.01439190265242768,
"mc2": 0.3790988407554485,
"mc2_stderr": 0.01414566169158044
},
"harness|winogrande|5": {
"acc": 0.5493291239147593,
"acc_stderr": 0.013983928869040239
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large | [
"region:us"
] | 2024-01-22T13:50:02+00:00 | {"pretty_name": "Evaluation run of LordNoah/Alpaca_refine_tuned_gpt2_large", "dataset_summary": "Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_refine_tuned_gpt2_large](https://huggingface.co/LordNoah/Alpaca_refine_tuned_gpt2_large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T13:48:41.144585](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large/blob/main/results_2024-01-22T13-48-41.144585.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27073502428002855,\n \"acc_stderr\": 0.031357406847036265,\n \"acc_norm\": 0.27231339886188605,\n \"acc_norm_stderr\": 0.032151283377254807,\n \"mc1\": 0.21542227662178703,\n \"mc1_stderr\": 0.01439190265242768,\n \"mc2\": 0.3790988407554485,\n \"mc2_stderr\": 0.01414566169158044\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.25426621160409557,\n \"acc_stderr\": 0.01272499994515774,\n \"acc_norm\": 0.27559726962457337,\n \"acc_norm_stderr\": 0.013057169655761838\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36367257518422624,\n \"acc_stderr\": 0.004800728138792372,\n \"acc_norm\": 0.4509061939852619,\n \"acc_norm_stderr\": 0.004965670398127349\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826368,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826368\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378949,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378949\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764822,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.03308818594415751,\n 
\"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.03308818594415751\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510927,\n \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510927\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842538,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842538\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755807,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755807\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n \"acc_stderr\": 0.028286324075564404,\n \"acc_norm\": 0.24786324786324787,\n \"acc_norm_stderr\": 0.028286324075564404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20178799489144317,\n \"acc_stderr\": 0.014351702181636873,\n \"acc_norm\": 0.20178799489144317,\n \"acc_norm_stderr\": 0.014351702181636873\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803838,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803838\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.33762057877813506,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02313237623454333,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02313237623454333\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n \"acc_stderr\": 0.010875700787694221,\n \"acc_norm\": 0.2379400260756193,\n \"acc_norm_stderr\": 0.010875700787694221\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.026799562024887674,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.026799562024887674\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779603,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779603\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879815,\n \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879815\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017197,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017197\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n \"mc1_stderr\": 0.01439190265242768,\n \"mc2\": 0.3790988407554485,\n \"mc2_stderr\": 0.01414566169158044\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5493291239147593,\n \"acc_stderr\": 0.013983928869040239\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772\n }\n}\n```", "repo_url": "https://huggingface.co/LordNoah/Alpaca_refine_tuned_gpt2_large", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-48-41.144585.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-48-41.144585.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-48-41.144585.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-48-41.144585.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-48-41.144585.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["**/details_harness|winogrande|5_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T13-48-41.144585.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T13_48_41.144585", "path": ["results_2024-01-22T13-48-41.144585.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T13-48-41.144585.parquet"]}]}]} | 2024-01-22T13:50:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_tuned_gpt2_large
Dataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_tuned_gpt2_large on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
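A minimal sketch of the loading call is shown below; the repository id is inferred from the model name and the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming, and `harness_winogrande_5` plus the `latest` split are taken from the configurations listed in this dataset's metadata.
```python
from datasets import load_dataset

# Each of the 63 task configurations has one split per run timestamp,
# plus a "latest" split pointing to the most recent results.
data = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_tuned_gpt2_large",
    "harness_winogrande_5",
    split="latest",
)
```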
## Latest results
These are the latest results from run 2024-01-22T13:48:41.144585 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_tuned_gpt2_large\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_tuned_gpt2_large on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:48:41.144585(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_tuned_gpt2_large\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_tuned_gpt2_large on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:48:41.144585(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
76b46352c2611fa5452f0a8ea713c772a189b537 |
# Dataset Card for Evaluation run of Kquant03/Buttercup-4x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-4x7B-bf16](https://huggingface.co/Kquant03/Buttercup-4x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T13:49:42.612013](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16/blob/main/results_2024-01-22T13-49-42.612013.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6511243311108075,
"acc_stderr": 0.03209436877391809,
"acc_norm": 0.6509529561946836,
"acc_norm_stderr": 0.03275570222819912,
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.672045395418692,
"mc2_stderr": 0.01521737838329955
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688065,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.01310678488360133
},
"harness|hellaswag|10": {
"acc": 0.7055367456681936,
"acc_stderr": 0.004548695749620959,
"acc_norm": 0.877414857598088,
"acc_norm_stderr": 0.00327290143493977
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856654,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.01662399851333311,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.01662399851333311
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.672045395418692,
"mc2_stderr": 0.01521737838329955
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613975
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
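The aggregated metrics above can also be loaded programmatically. The sketch below assumes the standard "results" configuration that these evaluation repositories expose, with its "latest" split pointing to the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores one row per run with the aggregated
# metrics; "latest" always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16",
    "results",
    split="latest",
)
print(results[0])
```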
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16 | [
"region:us"
] | 2024-01-22T13:52:02+00:00 | {"pretty_name": "Evaluation run of Kquant03/Buttercup-4x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-4x7B-bf16](https://huggingface.co/Kquant03/Buttercup-4x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T13:49:42.612013](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16/blob/main/results_2024-01-22T13-49-42.612013.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511243311108075,\n \"acc_stderr\": 0.03209436877391809,\n \"acc_norm\": 0.6509529561946836,\n \"acc_norm_stderr\": 0.03275570222819912,\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.672045395418692,\n \"mc2_stderr\": 0.01521737838329955\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688065,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360133\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7055367456681936,\n \"acc_stderr\": 0.004548695749620959,\n \"acc_norm\": 0.877414857598088,\n \"acc_norm_stderr\": 0.00327290143493977\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856654,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856654\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.01662399851333311,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.01662399851333311\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.672045395418692,\n \"mc2_stderr\": 0.01521737838329955\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613975\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 0.012643544762873358\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Buttercup-4x7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-49-42.612013.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-49-42.612013.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-49-42.612013.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T13-49-42.612013.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-49-42.612013.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T13_49_42.612013", "path": ["**/details_harness|winogrande|5_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T13-49-42.612013.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T13_49_42.612013", "path": ["results_2024-01-22T13-49-42.612013.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T13-49-42.612013.parquet"]}]}]} | 2024-01-22T13:52:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Buttercup-4x7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Buttercup-4x7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
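A minimal sketch of that call is shown below; the repository name is inferred from the leaderboard's usual `details_<org>__<model>` naming scheme (an assumption), and `harness_winogrande_5` is one of the configurations listed in this repository's metadata:

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-4x7B-bf16",
                    "harness_winogrande_5",
                    split="train")
```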
## Latest results
These are the latest results from run 2024-01-22T13:49:42.612013 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
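For quick reference, a few headline numbers from that run, excerpted from the results JSON stored in this repository's metadata:

```
"harness|truthfulqa:mc|0": { "mc1": 0.5373317013463892, "mc2": 0.672045395418692 },
"harness|winogrande|5":    { "acc": 0.819258089976322 },
"harness|gsm8k|5":         { "acc": 0.6982562547384382 }
```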
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Buttercup-4x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-4x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:49:42.612013(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Buttercup-4x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-4x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T13:49:42.612013(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
47445d12e55a3f38abe07f577a5d2eb247fb52a6 |
- **Curated by:** [Heigke](https://github.com/Heigke)
- **Funded by:** [r3tex](https://github.com/r3tex)
- **Shared by:** [Project Nephilim](https://github.com/r3tex/nephilim)
- **Language(s) (NLP):** English
- **License:** CC
# Dataset Card for stanford-enigma-philosophy-chat dataset
Roughly 27k questions and answers inspired by articles from [Stanford Encyclopedia of Philosophy](https://plato.stanford.edu/).
The questions range all the way from Zombies to the concept of Abduction, from Metaphysics to Neuroethics and thus cover some of the essence of mathematics, logic and philosophy.
## Dataset Details
The dataset is simple and ready for use by, e.g., [Mixtral 8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) in [LLama-Factory](https://github.com/hiyouga/LLaMA-Factory) with its three entries: input, instruction and output.
### Dataset Description/How to use
#### Format
The dataset is a json file with the following format:
```
[
{
"instruction": "You are an expert in philosophy.",
"input": "How would you describe William Whewell's contribution to philosophy and science in the 19th century Britain?",
"output": "William Whewell was a polymath who wrote extensively on various subjects including mechanics, mineralogy, geology, astronomy, political economy, theology, educational reform, internati>
},
{
"instruction": "You are an expert in philosophy.",
"input": "What were the key influences and associations in William Whewell's life?",
"output": "William Whewell was greatly influenced by his association with three of his fellow students at Cambridge: Charles Babbage, John Herschel, and Richard Jones."
}
]
```
#### How to use with transformers dataset
```
from datasets import load_dataset
dataset = load_dataset("Heigke/stanford-enigma-philosophy-chat")
```
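A quick way to sanity-check what was loaded (the `"train"` split name below is an assumption; `print(dataset)` shows the splits that actually exist):

```python
print(dataset)                 # lists the available splits and row counts
example = dataset["train"][0]  # "train" is assumed; use a split name shown above
print(example["instruction"])
print(example["input"])
print(example["output"])
```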
#### How to use with LLama-Factory
Alter the dataset_info.json at LLaMa-Factory/data with an extra entry like below:
```
{
"stanford-enigma-philosophy-chat": {
"hf_hub_url": "Heigke/stanford-enigma-philosophy-chat"
},
"philosophy": {
"file_name": "cleaned_philosophy_dataset.json",
"file_sha1": "3a771f4d524d513be37d8d31166274d3a18a610d"
},
"alpaca_en": {
"file_name": "alpaca_data_en_52k.json",
...
```
Then use the flag `--dataset stanford-enigma-philosophy-chat`.
For example, to QLoRA-train Mixtral with flash attention:
```
CUDA_VISIBLE_DEVICES=2 python3 src/train_bash.py \
    --stage sft \
    --do_train \
    --model_name_or_path mistralai/Mixtral-8x7B-Instruct-v0.1 \
    --dataset stanford-enigma-philosophy-chat \
    --template mistral \
    --finetuning_type lora \
    --lora_target q_proj,v_proj \
    --output_dir path_to_sft_checkpoint_hf \
    --overwrite_cache \
    --per_device_train_batch_size 4 \
    --gradient_accumulation_steps 4 \
    --lr_scheduler_type cosine \
    --logging_steps 10 \
    --save_steps 1000 \
    --learning_rate 5e-5 \
    --num_train_epochs 3.0 \
    --plot_loss \
    --flash_attn \
    --quantization_bit 4 \
    --cache_dir /mnt/hdd1
```
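In this invocation, `--quantization_bit 4` is what turns the LoRA run into a QLoRA run (the base model is loaded in 4-bit), `--flash_attn` switches to the FlashAttention implementation, and `--lora_target q_proj,v_proj` restricts the adapters to the attention query and value projections.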
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** -
- **Paper [optional]:** Coming
- **Demo [optional]:** Coming
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Heigke/stanford-enigma-philosophy-chat | [
"license:cc",
"region:us"
] | 2024-01-22T13:57:41+00:00 | {"license": "cc"} | 2024-01-22T16:00:07+00:00 | [] | [] | TAGS
#license-cc #region-us
|
- Curated by: Heigke
- Funded by: r3tex
- Shared by: Project Nephilim
- Language(s) (NLP): English
- License: CC
# Dataset Card for stanford-enigma-philosophy-chat dataset
Roughly 27k questions and answers inspired by articles from Stanford Encyclopedia of Philosophy.
The questions range all the way from Zombies to the concept of Abduction, from Metaphysics to Neuroethics and thus cover some of the essence of mathematics, logic and philosophy.
## Dataset Details
The dataset is simple and is ready for use by eg. Mixtral 8x7B in LLama-Factory with its three entries: input, instruction and output.
### Dataset Description/How to use
#### Format
The dataset is a json file with the following format:
#### How to use with transformers dataset
#### How to use with LLama-Factory
Alter the dataset_info.json at LLaMa-Factory/data with an extra entry like below:
Then use the flag
Like this for example if you want to qlora train mixtral with flash attention:
### Dataset Sources [optional]
- Repository: -
- Paper [optional]: Coming
- Demo [optional]: Coming
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for stanford-enigma-philosophy-chat dataset\n\nRoughly 27k questions and answers inspired by articles from Stanford Encyclopedia of Philosophy.\nThe questions range all the way from Zombies to the concept of Abduction, from Metaphysics to Neuroethics and thus cover some of the essence of mathematics, logic and philosophy.",
"## Dataset Details\nThe dataset is simple and is ready for use by eg. Mixtral 8x7B in LLama-Factory with its three entries: input, instruction and output.",
"### Dataset Description/How to use",
"#### Format\n\nThe dataset is a json file with the following format:",
"#### How to use with transformers dataset",
"#### How to use with LLama-Factory\nAlter the dataset_info.json at LLaMa-Factory/data with an extra entry like below:\n\nThen use the flag \nLike this for example if you want to qlora train mixtral with flash attention:",
"### Dataset Sources [optional]\n\n\n\n- Repository: -\n- Paper [optional]: Coming\n- Demo [optional]: Coming",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-cc #region-us \n",
"# Dataset Card for stanford-enigma-philosophy-chat dataset\n\nRoughly 27k questions and answers inspired by articles from Stanford Encyclopedia of Philosophy.\nThe questions range all the way from Zombies to the concept of Abduction, from Metaphysics to Neuroethics and thus cover some of the essence of mathematics, logic and philosophy.",
"## Dataset Details\nThe dataset is simple and is ready for use by eg. Mixtral 8x7B in LLama-Factory with its three entries: input, instruction and output.",
"### Dataset Description/How to use",
"#### Format\n\nThe dataset is a json file with the following format:",
"#### How to use with transformers dataset",
"#### How to use with LLama-Factory\nAlter the dataset_info.json at LLaMa-Factory/data with an extra entry like below:\n\nThen use the flag \nLike this for example if you want to qlora train mixtral with flash attention:",
"### Dataset Sources [optional]\n\n\n\n- Repository: -\n- Paper [optional]: Coming\n- Demo [optional]: Coming",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b7f480fbd69a199315f5718b68171a7880bfc0a2 |
# Dataset Card for Evaluation run of Eurdem/megatron_1.1_MoE_2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eurdem/megatron_1.1_MoE_2x7B](https://huggingface.co/Eurdem/megatron_1.1_MoE_2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T14:00:29.163053](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B/blob/main/results_2024-01-22T14-00-29.163053.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6536075619016702,
"acc_stderr": 0.031835849880102456,
"acc_norm": 0.6535987312796989,
"acc_norm_stderr": 0.0324956769564807,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5157817244014755,
"mc2_stderr": 0.015241425184790871
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6552901023890785,
"acc_norm_stderr": 0.01388881628678211
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559565,
"acc_norm": 0.8451503684524995,
"acc_norm_stderr": 0.0036102194130614605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493878,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524582,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524582
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212487,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212487
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.01275911706651801,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.01275911706651801
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5157817244014755,
"mc2_stderr": 0.015241425184790871
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156873
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.01243504233490401
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B | [
"region:us"
] | 2024-01-22T14:02:42+00:00 | {"pretty_name": "Evaluation run of Eurdem/megatron_1.1_MoE_2x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Eurdem/megatron_1.1_MoE_2x7B](https://huggingface.co/Eurdem/megatron_1.1_MoE_2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T14:00:29.163053](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B/blob/main/results_2024-01-22T14-00-29.163053.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536075619016702,\n \"acc_stderr\": 0.031835849880102456,\n \"acc_norm\": 0.6535987312796989,\n \"acc_norm_stderr\": 0.0324956769564807,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5157817244014755,\n \"mc2_stderr\": 0.015241425184790871\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n \"acc_norm\": 0.6552901023890785,\n \"acc_norm_stderr\": 0.01388881628678211\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n \"acc_stderr\": 0.004749286071559565,\n \"acc_norm\": 0.8451503684524995,\n \"acc_norm_stderr\": 0.0036102194130614605\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493878,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493878\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524582,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524582\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n \"acc_stderr\": 0.015839400406212487,\n \"acc_norm\": 0.3396648044692737,\n \"acc_norm_stderr\": 0.015839400406212487\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n \"acc_stderr\": 0.01275911706651801,\n \"acc_norm\": 0.4791395045632334,\n \"acc_norm_stderr\": 0.01275911706651801\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5157817244014755,\n \"mc2_stderr\": 0.015241425184790871\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156873\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \"acc_stderr\": 0.01243504233490401\n }\n}\n```", "repo_url": 
"https://huggingface.co/Eurdem/megatron_1.1_MoE_2x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|arc:challenge|25_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|gsm8k|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hellaswag|10_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T14-00-29.163053.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T14-00-29.163053.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T14-00-29.163053.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T14-00-29.163053.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T14-00-29.163053.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T14_00_29.163053", "path": ["**/details_harness|winogrande|5_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T14-00-29.163053.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T14_00_29.163053", "path": ["results_2024-01-22T14-00-29.163053.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T14-00-29.163053.parquet"]}]}]} | 2024-01-22T14:03:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Eurdem/megatron_1.1_MoE_2x7B
Dataset automatically created during the evaluation run of model Eurdem/megatron_1.1_MoE_2x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
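The snippet below is the loading example given in the card's metadata; the repository id and the `harness_winogrande_5` configuration name come from the configs listed in that metadata, and any other configuration name from the same list can be substituted:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (here: Winogrande, 5-shot);
# swap the config name for any other task listed in the configs metadata.
data = load_dataset(
    "open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B",
    "harness_winogrande_5",
    split="train",
)
```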
## Latest results
These are the latest results from run 2024-01-22T14:00:29.163053 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
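A minimal sketch for pulling these aggregated numbers programmatically, assuming the "results" configuration and its "latest" split behave as declared in the configs metadata:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run (acc, acc_norm, mc1/mc2, etc.)
results = load_dataset(
    "open-llm-leaderboard/details_Eurdem__megatron_1.1_MoE_2x7B",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores per task
```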
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Eurdem/megatron_1.1_MoE_2x7B\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/megatron_1.1_MoE_2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T14:00:29.163053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Eurdem/megatron_1.1_MoE_2x7B\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/megatron_1.1_MoE_2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T14:00:29.163053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
58ef27c5bf8eb132c84cf78923e3d474b2a9e300 |
## Python Copilot Audio Training using Global Functions with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each global function has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, alongside the associated source code **file_path** identifier.
- Rows: 49910
- Size: 62.8 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-functions-knowledge-graphs", data_dir="files")
```
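Once loaded, a row's audio can be written back out as a playable mp3. The sketch below is illustrative only: it assumes the `load_dataset` call above succeeded and relies on the field names from the schema (`audio_path`, `dbytes`, `dbytes_len`); the output directory is an arbitrary choice.
```python
import os

# Illustrative sketch: dump the first row's narrated mp3 to disk.
split = next(iter(ds.values()))   # first available split in the DatasetDict
row = split[0]

os.makedirs("extracted_audio", exist_ok=True)
out_path = os.path.join("extracted_audio", os.path.basename(row["audio_path"]))
with open(out_path, "wb") as f:
    f.write(row["dbytes"])        # raw mp3 bytes, per the schema above
print(f"wrote {row['dbytes_len']} bytes to {out_path}")
```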
| matlok/python-audio-copilot-training-using-function-knowledge-graphs | [
"task_categories:text-to-audio",
"task_categories:audio-to-audio",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:10K<n<100K",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"functions",
"global-functions",
"region:us"
] | 2024-01-22T14:23:44+00:00 | {"license": ["other"], "size_categories": ["10K<n<100K"], "task_categories": ["text-to-audio", "audio-to-audio", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot audio training using global functions with knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-audio.func-v1_00000095.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "functions", "global-functions"]} | 2024-01-25T18:53:06+00:00 | [] | [] | TAGS
#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #functions #global-functions #region-us
|
## Python Copilot Audio Training using Global Functions with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each global function has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.
- Rows: 49910
- Size: 62.8 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
### How to use the dataset
| [
"## Python Copilot Audio Training using Global Functions with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach global function has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 49910\n- Size: 62.8 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #functions #global-functions #region-us \n",
"## Python Copilot Audio Training using Global Functions with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach global function has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 49910\n- Size: 62.8 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] |
2a276deda9adaefeddea7752ce7a5e6fe0034382 |
## Python Copilot Audio Training using Inheritance and Polymorphism Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each base class for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, alongside the associated source code **file_path** identifier.
- Rows: 96874
- Size: 29.9 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-inheritance-knowledge-graphs", data_dir="files")
```
| matlok/python-audio-copilot-training-using-inheritance-knowledge-graphs | [
"task_categories:text-to-audio",
"task_categories:audio-to-audio",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:10K<n<100K",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"inheritance",
"region:us"
] | 2024-01-22T14:24:06+00:00 | {"license": ["other"], "size_categories": ["10K<n<100K"], "task_categories": ["text-to-audio", "audio-to-audio", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot audio training using inheritance and polymorphism knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-audio.base-v1_00000291.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "inheritance"]} | 2024-01-25T18:53:35+00:00 | [] | [] | TAGS
#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #inheritance #region-us
|
## Python Copilot Audio Training using Inheritance and Polymorphism Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each base class for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.
- Rows: 96874
- Size: 29.9 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
### How to use the dataset
| [
"## Python Copilot Audio Training using Inheritance and Polymorphism Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach base class for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 96874\n- Size: 29.9 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #inheritance #region-us \n",
"## Python Copilot Audio Training using Inheritance and Polymorphism Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach base class for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 96874\n- Size: 29.9 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] |
a4bfdb7dca02c9d96ea1c381e58559d565efe1fb |
## Python Copilot Audio Training using Imports with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each imported module for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, alongside the associated source code **file_path** identifier.
- Rows: 52086
- Size: 17.3 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-imports-knowledge-graphs", data_dir="files")
```
| matlok/python-audio-copilot-training-using-import-knowledge-graphs | [
"task_categories:text-to-audio",
"task_categories:audio-to-audio",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:10K<n<100K",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"imports",
"region:us"
] | 2024-01-22T14:24:31+00:00 | {"license": ["other"], "size_categories": ["10K<n<100K"], "task_categories": ["text-to-audio", "audio-to-audio", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot audio training using imports with knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-audio.import-v1_00000274.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "imports"]} | 2024-01-25T18:53:20+00:00 | [] | [] | TAGS
#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #imports #region-us
|
## Python Copilot Audio Training using Imports with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each imported module for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.
- Rows: 52086
- Size: 17.3 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
### How to use the dataset
| [
"## Python Copilot Audio Training using Imports with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach imported module for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 52086\n- Size: 17.3 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #imports #region-us \n",
"## Python Copilot Audio Training using Imports with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach imported module for each unique class in each module file has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 52086\n- Size: 17.3 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] |
3356d1e4157acd011c6b28489b46b1acf43597e0 | <h1 align="center"> DATASET-NAME: Code Reasoning, Understanding, and Execution Evaluation </h1>
<p align="center">
<a href="https://crux-eval.github.io/">🏠 Home Page</a> •
<a href="https://github.com/facebookresearch/cruxeval">💻 GitHub Repository </a> •
<a href="https://crux-eval.github.io/leaderboard.html">🏆 Leaderboard</a> •
<a href="https://crux-eval.github.io/demo.html">🔎 Sample Explorer</a>
</p>

DATASET-NAME (**C**ode **R**easoning, **U**nderstanding, and e**X**ecution **Eval**uation) is a benchmark of 800 Python functions and input-output pairs. The benchmark consists of two tasks, CRUXEval-I (input prediction) and CRUXEval-O (output prediction).
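For intuition, here is a hypothetical item in the style of the benchmark (not drawn from the actual dataset): a small Python function paired with an input and its output. CRUXEval-O asks a model to predict the output given the function and input, while CRUXEval-I asks it to predict an input consistent with a given output.
```python
# Hypothetical example in the style of the benchmark (not an actual CRUXEval item).
def f(xs):
    out = []
    for x in xs:
        if x % 2 == 0:
            out.append(x * x)
    return out

# CRUXEval-O: given f and the input [1, 2, 3, 4], predict the output.
assert f([1, 2, 3, 4]) == [4, 16]

# CRUXEval-I: given f and the output [4, 16], predict an input that produces it,
# e.g. [2, 4] (inputs are generally not unique).
assert f([2, 4]) == [4, 16]
```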
The benchmark was constructed as follows
## Dataset Description
- **Homepage:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** https://github.com/
- **Paper:** https://arxiv.org/
- **Point of Contact:** [NAME](mailto:EMAIL) | albertvillanova/test-dataset-card | [
"task_categories:text-classification",
"task_ids:multi-label-classification",
"region:us"
] | 2024-01-22T15:05:43+00:00 | {"task_categories": ["text-classification"], "task_ids": ["multi-label-classification", "toxic-comment-classification"]} | 2024-01-25T08:15:40+00:00 | [] | [] | TAGS
#task_categories-text-classification #task_ids-multi-label-classification #region-us
| <h1 align="center"> DATASET-NAME: Code Reasoning, Understanding, and Execution Evaluation </h1>
<p align="center">
<a href="URL Home Page</a> •
<a href="URL GitHub Repository </a> •
<a href="URL Leaderboard</a> •
<a href="URL Sample Explorer</a>
</p>
!image
DATASET-NAME (Code Reasoning, Understanding, and eXecution Evaluation) is a benchmark of 800 Python functions and input-output pairs. The benchmark consists of two tasks, CRUXEval-I (input prediction) and CRUXEval-O (output prediction).
The benchmark was constructed as follows
## Dataset Description
- Homepage:
- Repository: URL
- Paper: URL
- Point of Contact: NAME | [
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: URL\n- Point of Contact: NAME"
] | [
"TAGS\n#task_categories-text-classification #task_ids-multi-label-classification #region-us \n",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: URL\n- Point of Contact: NAME"
] |
1af2f0ef96f32ba5ff8d9b82b8cb5fc4810430bd | # The Security Attack Pattern (TTP) Recognition or Mapping Task
[](https://creativecommons.org/licenses/by/4.0/)
[](https://arxiv.org/abs/2401.10337)
We share in this repo the MITRE ATT&CK mapping datasets, with `training`, `validation` and `test` splits.
The datasets can be considered an emerging and challenging `multilabel classification` NLP task, with over 600 hierarchical classes.
NOTE: due to their security nature, these datasets contain textual information about `malware` and other security aspects.
## Datasets
### TRAM
This dataset belongs to [CTID](https://mitre-engenuity.org/cybersecurity/center-for-threat-informed-defense/) and is originally provided in this [github link](https://github.com/center-for-threat-informed-defense/tram).
We processed the original files (i.e., gathered from all sources, removed duplicates, resolved noisy / too-short text and noisy labels, remapped to MITRE ATT&CK 12.0) and split them into training, dev and test splits.
### Procedure+
The dataset consists of two sub-datasets:
- Procedures: belong to [MITRE](https://github.com/mitre/cti/tree/master). All procedure examples from v12.0 are gathered, processed (i.e., markup removed) and split into training, dev and test splits.
- Derived procedures: we crawled the URL references for each procedure example and extracted the original text from the articles determined to be relevant to the procedure examples. The texts are processed and split into training, dev and test splits.
### Expert
The dataset is constructed from a large pool of high-quality threat reports.
The rich textual paragraphs are carefully selected and then annotated by seasoned security experts.
The dataset is also pre-split into `training`, `dev` and `test` splits. There are ~4 labels per text in the `test` split, on average.
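A minimal loading sketch is given below; it assumes the standard `datasets` API, and the configuration/split layout is not verified against the repository, so treat the names used here as placeholders.
```python
from datasets import load_dataset

# Sketch only: configuration/split layout is assumed, not verified against the repo.
ds = load_dataset("tumeteor/MITRE-TTP-Mapping")
print(ds)             # inspect the available splits
print(ds["test"][0])  # expect a text passage plus one or more MITRE ATT&CK technique labels
```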
## Citations
If you use the datasets in your research or want to refer to our work, please cite:
```
@inproceedings{nguyen-srndic-neth-ttpm,
title = "Noise Contrastive Estimation-based Matching Framework for Low-resource Security Attack Pattern Recognition",
author = "Nguyen, Tu and Šrndić, Nedim and Neth, Alexander",
booktitle = "Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics",
month = mar,
year = "2024",
publisher = "Association for Computational Linguistics",
abstract = "Tactics, Techniques and Procedures (TTPs) represent sophisticated attack patterns in the cybersecurity domain, described encyclopedically in textual knowledge bases. Identifying TTPs in cybersecurity writing, often called TTP mapping, is an important and challenging task. Conventional learning approaches often target the problem in the classical multi-class or multilabel classification setting. This setting hinders the learning ability of the model due to a large number of classes (i.e., TTPs), the inevitable skewness of the label distribution and the complex hierarchical structure of the label space. We formulate the problem in a different learning paradigm, where the assignment of a text to a TTP label is decided by the direct semantic similarity between the two, thus reducing the complexity of competing solely over the large labeling space. To that end, we propose a neural matching architecture with an effective sampling-based learn-to-compare mechanism, facilitating the learning process of the matching model despite constrained resources.",
}
```
## License
This project is licensed under the Creative Commons CC BY License, version 4.0. | tumeteor/MITRE-TTP-Mapping | [
"task_categories:text-classification",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:sentence-similarity",
"size_categories:1K<n<10K",
"language:en",
"license:cc",
"security",
"ttp mapping",
"mitre att&ck",
"extreme multilabel ",
"multilabel classification",
"arxiv:2401.10337",
"region:us"
] | 2024-01-22T15:16:40+00:00 | {"language": ["en"], "license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification", "question-answering", "zero-shot-classification", "sentence-similarity"], "pretty_name": "Security Attack Pattern Recognition Datasets", "tags": ["security", "ttp mapping", "mitre att&ck", "extreme multilabel ", "multilabel classification"]} | 2024-01-23T09:52:13+00:00 | [
"2401.10337"
] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-sentence-similarity #size_categories-1K<n<10K #language-English #license-cc #security #ttp mapping #mitre att&ck #extreme multilabel #multilabel classification #arxiv-2401.10337 #region-us
| # The Security Attack Pattern (TTP) Recognition or Mapping Task
 and split into training, dev and test splits.
### Procedure+
The dataset consists of two sub- datasets:
- Procedures: belong to MITRE. All procedure examples from v12.0 are gathered and processed (i.e., remove markups) and split into training, dev and test splits.
- Derived procedures: we crawled the URL references for each procedure example, and extract original text from the articles that are determined to be relevant to the procedure examples. The text are processed and split into training, dev and test splits.
### Expert
The dataset is constructed from a large pool of high-quality threat reports.
The rich textual paragraphs are carefully selected and then annotated by seasoned security experts.
The dataset is also pre-split into 'training', 'dev' and 'test' splits. There are ~4 labels per text in the 'test' split, on average.
## Citations
If you use the datasets in your research or want to refer to our work, please cite:
## License
This project is licensed under the Creative Commons CC BY License, version 4.0. | [
"# The Security Attack Pattern (TTP) Recognition or Mapping Task\n and split into training, dev and test splits.",
"### Procedure+\n\nThe dataset consists of two sub- datasets:\n- Procedures: belong to MITRE. All procedure examples from v12.0 are gathered and processed (i.e., remove markups) and split into training, dev and test splits.\n- Derived procedures: we crawled the URL references for each procedure example, and extract original text from the articles that are determined to be relevant to the procedure examples. The text are processed and split into training, dev and test splits.",
"### Expert\n\nThe dataset is constructed from a large pool of high-quality threat reports. \nThe rich textual paragraphs are carefully selected and then annotated by seasoned security experts.\n\nThe dataset is also pre-split into 'training', 'dev' and 'test' splits. There are ~4 labels per text in the 'test' split, on average.\n\ns\nIf you use the datasets in your research or want to refer to our work, please cite:",
"## License\nThis project is licensed under the Creative Commons CC BY License, version 4.0."
] | [
"TAGS\n#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-sentence-similarity #size_categories-1K<n<10K #language-English #license-cc #security #ttp mapping #mitre att&ck #extreme multilabel #multilabel classification #arxiv-2401.10337 #region-us \n",
"# The Security Attack Pattern (TTP) Recognition or Mapping Task\n and split into training, dev and test splits.",
"### Procedure+\n\nThe dataset consists of two sub- datasets:\n- Procedures: belong to MITRE. All procedure examples from v12.0 are gathered and processed (i.e., remove markups) and split into training, dev and test splits.\n- Derived procedures: we crawled the URL references for each procedure example, and extract original text from the articles that are determined to be relevant to the procedure examples. The text are processed and split into training, dev and test splits.",
"### Expert\n\nThe dataset is constructed from a large pool of high-quality threat reports. \nThe rich textual paragraphs are carefully selected and then annotated by seasoned security experts.\n\nThe dataset is also pre-split into 'training', 'dev' and 'test' splits. There are ~4 labels per text in the 'test' split, on average.\n\ns\nIf you use the datasets in your research or want to refer to our work, please cite:",
"## License\nThis project is licensed under the Creative Commons CC BY License, version 4.0."
] |
fe0557b174c08ea0d566bdd60d72aaf944250f5b |
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-10b-v17.1-4k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-10b-v17.1-4k](https://huggingface.co/OpenBuddy/openbuddy-deepseek-10b-v17.1-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T15:20:55.890442](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k/blob/main/results_2024-01-22T15-20-55.890442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5275033536344877,
"acc_stderr": 0.03384069886094482,
"acc_norm": 0.5358946325439762,
"acc_norm_stderr": 0.034669341786001784,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920616,
"mc2": 0.45957802308964574,
"mc2_stderr": 0.015178526140313892
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.014609667440892567,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.014555949760496442
},
"harness|hellaswag|10": {
"acc": 0.579964150567616,
"acc_stderr": 0.004925556104679422,
"acc_norm": 0.7692690699063931,
"acc_norm_stderr": 0.004204395478506433
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651283,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651283
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261117,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261117
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.02800913812540039,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.02800913812540039
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070645,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070645
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.019069098363191435,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.019069098363191435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679217,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679217
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.01502408388332289,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.01502408388332289
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210635,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480617,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480617
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3852672750977836,
"acc_stderr": 0.012429485434955192,
"acc_norm": 0.3852672750977836,
"acc_norm_stderr": 0.012429485434955192
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969765,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920616,
"mc2": 0.45957802308964574,
"mc2_stderr": 0.015178526140313892
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552673
},
"harness|gsm8k|5": {
"acc": 0.04473085670962851,
"acc_stderr": 0.005693886131407044
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k | [
"region:us"
] | 2024-01-22T15:23:11+00:00 | {"pretty_name": "Evaluation run of OpenBuddy/openbuddy-deepseek-10b-v17.1-4k", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-10b-v17.1-4k](https://huggingface.co/OpenBuddy/openbuddy-deepseek-10b-v17.1-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T15:20:55.890442](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k/blob/main/results_2024-01-22T15-20-55.890442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5275033536344877,\n \"acc_stderr\": 0.03384069886094482,\n \"acc_norm\": 0.5358946325439762,\n \"acc_norm_stderr\": 0.034669341786001784,\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.016305988648920616,\n \"mc2\": 0.45957802308964574,\n \"mc2_stderr\": 0.015178526140313892\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892567,\n \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.014555949760496442\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.579964150567616,\n \"acc_stderr\": 0.004925556104679422,\n \"acc_norm\": 0.7692690699063931,\n \"acc_norm_stderr\": 0.004204395478506433\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04122728707651283,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 
0.04122728707651283\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261117,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261117\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n \"acc_stderr\": 0.02800913812540039,\n \"acc_norm\": 0.5870967741935483,\n \"acc_norm_stderr\": 0.02800913812540039\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070645,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070645\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390988,\n \"acc_norm\": 
0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390988\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.728440366972477,\n \"acc_stderr\": 0.019069098363191435,\n \"acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.019069098363191435\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679217,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679217\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n \"acc_stderr\": 0.01502408388332289,\n \"acc_norm\": 0.28044692737430166,\n \"acc_norm_stderr\": 0.01502408388332289\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n \"acc_stderr\": 0.028043399858210635,\n \"acc_norm\": 0.5787781350482315,\n \"acc_norm_stderr\": 0.028043399858210635\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480617,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480617\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3852672750977836,\n \"acc_stderr\": 0.012429485434955192,\n \"acc_norm\": 0.3852672750977836,\n \"acc_norm_stderr\": 0.012429485434955192\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969765,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.016305988648920616,\n \"mc2\": 0.45957802308964574,\n \"mc2_stderr\": 0.015178526140313892\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552673\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.04473085670962851,\n \"acc_stderr\": 0.005693886131407044\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-deepseek-10b-v17.1-4k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-20-55.890442.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-20-55.890442.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-20-55.890442.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-20-55.890442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-20-55.890442.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["**/details_harness|winogrande|5_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T15-20-55.890442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T15_20_55.890442", "path": ["results_2024-01-22T15-20-55.890442.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T15-20-55.890442.parquet"]}]}]} | 2024-01-22T15:23:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-10b-v17.1-4k
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseek-10b-v17.1-4k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
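A minimal sketch of that load call, assuming the details repository follows the usual open-llm-leaderboard naming convention (`details_<org>__<model>`) and using `harness_winogrande_5` as an example configuration:

```python
from datasets import load_dataset

# Load one evaluation configuration from the details repository.
# The repository id below is an assumption based on the standard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-10b-v17.1-4k",
    "harness_winogrande_5",
    split="train",
)
print(data)
```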
## Latest results
These are the latest results from run 2024-01-22T15:20:55.890442 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-10b-v17.1-4k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseek-10b-v17.1-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:20:55.890442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-10b-v17.1-4k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseek-10b-v17.1-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:20:55.890442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a457c3ce0f615874227f22a8e1b6c368eb87ea05 |
# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_gpt2_e0_se1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_spin_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_spin_gpt2_e0_se1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T15:39:18.329884](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1/blob/main/results_2024-01-22T15-39-18.329884.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26633934320104735,
"acc_stderr": 0.03117121319128777,
"acc_norm": 0.267974422544881,
"acc_norm_stderr": 0.031986920897447174,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.01465133732460258,
"mc2": 0.38883435647490394,
"mc2_stderr": 0.014308709852398498
},
"harness|arc:challenge|25": {
"acc": 0.2568259385665529,
"acc_stderr": 0.0127669237941168,
"acc_norm": 0.27986348122866894,
"acc_norm_stderr": 0.013119040897725923
},
"harness|hellaswag|10": {
"acc": 0.36516630153355906,
"acc_stderr": 0.0048049276087731374,
"acc_norm": 0.4583748257319259,
"acc_norm_stderr": 0.0049724602068423026
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32075471698113206,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.32075471698113206,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628817,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287394,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3119266055045872,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.3119266055045872,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859697,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859697
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.02257451942417487,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.02257451942417487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.03952301967702511,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.03952301967702511
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.19923371647509577,
"acc_stderr": 0.014283378044296415,
"acc_norm": 0.19923371647509577,
"acc_norm_stderr": 0.014283378044296415
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098407,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.026160584450140488,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.026160584450140488
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165441,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165441
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541114,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541114
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148386,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148386
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233134,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233134
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.01465133732460258,
"mc2": 0.38883435647490394,
"mc2_stderr": 0.014308709852398498
},
"harness|winogrande|5": {
"acc": 0.5516969218626677,
"acc_stderr": 0.013977171307126343
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225394
}
}
```
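The aggregated scores above can also be pulled programmatically from the "results" configuration described earlier; a minimal sketch, assuming the "latest" split convention used by these detail repositories:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points to the most recent results.
results = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated scores for the run
```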
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1 | [
"region:us"
] | 2024-01-22T15:40:41+00:00 | {"pretty_name": "Evaluation run of LordNoah/Alpaca_spin_gpt2_e0_se1", "dataset_summary": "Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_spin_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_spin_gpt2_e0_se1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T15:39:18.329884](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1/blob/main/results_2024-01-22T15-39-18.329884.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26633934320104735,\n \"acc_stderr\": 0.03117121319128777,\n \"acc_norm\": 0.267974422544881,\n \"acc_norm_stderr\": 0.031986920897447174,\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.01465133732460258,\n \"mc2\": 0.38883435647490394,\n \"mc2_stderr\": 0.014308709852398498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2568259385665529,\n \"acc_stderr\": 0.0127669237941168,\n \"acc_norm\": 0.27986348122866894,\n \"acc_norm_stderr\": 0.013119040897725923\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36516630153355906,\n \"acc_stderr\": 0.0048049276087731374,\n \"acc_norm\": 0.4583748257319259,\n \"acc_norm_stderr\": 0.0049724602068423026\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.32075471698113206,\n \"acc_stderr\": 0.028727502957880274,\n \"acc_norm\": 0.32075471698113206,\n \"acc_norm_stderr\": 0.028727502957880274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628817,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628817\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287394,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.22797927461139897,\n 
\"acc_norm_stderr\": 0.03027690994517826\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3119266055045872,\n \"acc_stderr\": 0.019862967976707245,\n \"acc_norm\": 0.3119266055045872,\n \"acc_norm_stderr\": 0.019862967976707245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859697,\n \"acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859697\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n \"acc_stderr\": 0.02257451942417487,\n \"acc_norm\": 0.13004484304932734,\n \"acc_norm_stderr\": 0.02257451942417487\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.34710743801652894,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.03952301967702511,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.03952301967702511\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 
0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.19923371647509577,\n \"acc_stderr\": 0.014283378044296415,\n \"acc_norm\": 0.19923371647509577,\n \"acc_norm_stderr\": 0.014283378044296415\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n \"acc_stderr\": 0.026160584450140488,\n \"acc_norm\": 0.3054662379421222,\n \"acc_norm_stderr\": 0.026160584450140488\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n \"acc_stderr\": 0.011064151027165441,\n \"acc_norm\": 0.2503259452411995,\n \"acc_norm_stderr\": 0.011064151027165441\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541114,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541114\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148386,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148386\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.03384429155233134,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.03384429155233134\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.01465133732460258,\n \"mc2\": 0.38883435647490394,\n \"mc2_stderr\": 0.014308709852398498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5516969218626677,\n \"acc_stderr\": 
0.013977171307126343\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225394\n }\n}\n```", "repo_url": "https://huggingface.co/LordNoah/Alpaca_spin_gpt2_e0_se1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-39-18.329884.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-39-18.329884.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-39-18.329884.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-39-18.329884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-39-18.329884.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["**/details_harness|winogrande|5_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T15-39-18.329884.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T15_39_18.329884", "path": ["results_2024-01-22T15-39-18.329884.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T15-39-18.329884.parquet"]}]}]} | 2024-01-22T15:41:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_gpt2_e0_se1
Dataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_gpt2_e0_se1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
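(A minimal sketch, reconstructed from this card's own metadata; "harness_winogrande_5" is one of the 63 configurations listed there and can be swapped for any other.)

```python
from datasets import load_dataset

# Per-sample details for one evaluation task; the "train" split
# always points to the latest run, as described above.
data = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_spin_gpt2_e0_se1",
    "harness_winogrande_5",
    split="train",
)
```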
## Latest results
These are the latest results from run 2024-01-22T15:39:18.329884 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_gpt2_e0_se1\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_gpt2_e0_se1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:39:18.329884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LordNoah/Alpaca_spin_gpt2_e0_se1\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_spin_gpt2_e0_se1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:39:18.329884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
db134dbc02461edb6139b1b4f94b07991d1f141b | Dataset created from the 20k_random_data.txt file linked by [kalomaze](https://github.com/kalomaze) here: [https://github.com/ggerganov/llama.cpp/discussions/5006](https://github.com/ggerganov/llama.cpp/discussions/5006#discussioncomment-8163190) | llmixer/20k_random_data | [
"20k",
"random",
"region:us"
] | 2024-01-22T15:54:27+00:00 | {"tags": ["20k", "random"]} | 2024-01-22T15:58:11+00:00 | [] | [] | TAGS
#20k #random #region-us
| Dataset created from the 20k_random_data.txt file linked by kalomaze here: URL | [] | [
"TAGS\n#20k #random #region-us \n"
] |
a7d0cda6e629f7cc258da6e805271e577903c75d |
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0",
"harness_winogrande_5",
split="train")
```
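The aggregated metrics mentioned above live in the "results" configuration. A minimal sketch of loading it, assuming the "latest" split naming convention described in this card:

```python
from datasets import load_dataset

# Aggregated results of the run; "latest" always points to the most recent eval.
results = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0",
    "results",
    split="latest",
)
```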
## Latest results
These are the [latest results from run 2024-01-22T15:53:57.776809](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0/blob/main/results_2024-01-22T15-53-57.776809.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5840566611683854,
"acc_stderr": 0.0336733702398527,
"acc_norm": 0.5887774362139145,
"acc_norm_stderr": 0.03437128171361439,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5954200996176123,
"mc2_stderr": 0.01553089056885833
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714702,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.01435639941800912
},
"harness|hellaswag|10": {
"acc": 0.630551682931687,
"acc_stderr": 0.00481669012320976,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.003778804474605908
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011746,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.02528558599001786,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.02528558599001786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734198,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734198
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584197,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584197
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.016094338768474593,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.016094338768474593
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159607,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159607
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4061277705345502,
"acc_stderr": 0.012543154588412932,
"acc_norm": 0.4061277705345502,
"acc_norm_stderr": 0.012543154588412932
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5954200996176123,
"mc2_stderr": 0.01553089056885833
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205207
},
"harness|gsm8k|5": {
"acc": 0.3601213040181956,
"acc_stderr": 0.013222559423250485
}
}
```
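As a hedged sketch of how these numbers might be consumed downstream (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename used only for illustration):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Overall aggregated accuracy across tasks.
print("average acc:", results["all"]["acc"])

# Per-subtask accuracy for the MMLU (hendrycksTest) tasks.
for task, metrics in results.items():
    if task.startswith("harness|hendrycksTest-"):
        print(task, metrics["acc"])
```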
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0 | [
"region:us"
] | 2024-01-22T15:56:13+00:00 | {"pretty_name": "Evaluation run of abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T15:53:57.776809](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0/blob/main/results_2024-01-22T15-53-57.776809.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5840566611683854,\n \"acc_stderr\": 0.0336733702398527,\n \"acc_norm\": 0.5887774362139145,\n \"acc_norm_stderr\": 0.03437128171361439,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5954200996176123,\n \"mc2_stderr\": 0.01553089056885833\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714702,\n \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.01435639941800912\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.630551682931687,\n \"acc_stderr\": 0.00481669012320976,\n \"acc_norm\": 0.8265285799641505,\n \"acc_norm_stderr\": 0.003778804474605908\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n 
\"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": 
{\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011746,\n \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011746\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.02528558599001786,\n \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.02528558599001786\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n 
\"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n \"acc_stderr\": 0.015274685213734198,\n \"acc_norm\": 0.7598978288633461,\n \"acc_norm_stderr\": 0.015274685213734198\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584197,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584197\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.016094338768474593,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.016094338768474593\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159607,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159607\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4061277705345502,\n \"acc_stderr\": 0.012543154588412932,\n \"acc_norm\": 0.4061277705345502,\n \"acc_norm_stderr\": 0.012543154588412932\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5954200996176123,\n \"mc2_stderr\": 0.01553089056885833\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n 
\"acc_stderr\": 0.011705697565205207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3601213040181956,\n \"acc_stderr\": 0.013222559423250485\n }\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-53-57.776809.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-53-57.776809.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-53-57.776809.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T15-53-57.776809.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T15-53-57.776809.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["**/details_harness|winogrande|5_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T15-53-57.776809.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T15_53_57.776809", "path": ["results_2024-01-22T15-53-57.776809.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T15-53-57.776809.parquet"]}]}]} | 2024-01-22T15:56:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0
Dataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-22T15:53:57.776809 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:53:57.776809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-7B-Instruct-v0.2-5000-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T15:53:57.776809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d4e1492ee5505d84df4f1de684f641830fcf2034 |
# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_refine_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1",
"harness_winogrande_5",
split="train")
```
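
Each of the 63 configurations can also be discovered programmatically. The sketch below assumes the standard `get_dataset_config_names` helper from the `datasets` library; the names it returns should match the `harness_*` and `results` entries listed in this card's metadata.

```python
from datasets import get_dataset_config_names

# List every evaluation configuration stored in this repository
# (e.g. "harness_winogrande_5", "harness_gsm8k_5", "results", ...).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1"
)
print(len(configs), "configurations found")
for name in sorted(configs):
    print(name)
```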
## Latest results
These are the [latest results from run 2024-01-22T16:13:20.086955](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1/blob/main/results_2024-01-22T16-13-20.086955.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2707744494888737,
"acc_stderr": 0.031386153673103406,
"acc_norm": 0.272619145262433,
"acc_norm_stderr": 0.03218390044538922,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.37888278063696673,
"mc2_stderr": 0.014137600334109192
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313366,
"acc_norm": 0.29180887372013653,
"acc_norm_stderr": 0.013284525292403508
},
"harness|hellaswag|10": {
"acc": 0.36367257518422624,
"acc_stderr": 0.004800728138792374,
"acc_norm": 0.4534953196574388,
"acc_norm_stderr": 0.004968151878211051
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3283018867924528,
"acc_stderr": 0.028901593612411784,
"acc_norm": 0.3283018867924528,
"acc_norm_stderr": 0.028901593612411784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083287,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083287
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518753,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518753
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.0328264938530415,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.0328264938530415
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.034107802518361825,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.034107802518361825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844065,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20051085568326948,
"acc_stderr": 0.014317653708594207,
"acc_norm": 0.20051085568326948,
"acc_norm_stderr": 0.014317653708594207
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451163,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659674,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659674
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.02604066247420126,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.02604066247420126
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2571428571428571,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.2571428571428571,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.37888278063696673,
"mc2_stderr": 0.014137600334109192
},
"harness|winogrande|5": {
"acc": 0.5430149960536701,
"acc_stderr": 0.01400038676159829
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022545087
}
}
```
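
If you prefer to work with these aggregated numbers programmatically instead of copying them from the JSON above, a minimal sketch (assuming the `results` configuration and the `latest` split described earlier) is:

```python
from datasets import load_dataset

# The "results" configuration aggregates the per-task metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1",
    "results",
    split="latest",
)

print(results.column_names)  # which fields are stored for this run
print(results[0])            # the aggregated metrics of the latest run
```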
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
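
While the template above is left blank, each per-task configuration is a plain parquet-backed dataset whose schema can be inspected directly. The sketch below uses the `harness_gsm8k_5` configuration named in this card's metadata and only prints whatever fields happen to be present, without assuming a particular layout:

```python
from datasets import load_dataset

# Inspect the schema of one per-task configuration without assuming its fields.
details = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1",
    "harness_gsm8k_5",
    split="latest",
)

print(details.features)   # field names and types
print(details.num_rows)   # number of evaluated examples
```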
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1 | [
"region:us"
] | 2024-01-22T16:14:40+00:00 | {"pretty_name": "Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1", "dataset_summary": "Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_refine_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T16:13:20.086955](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1/blob/main/results_2024-01-22T16-13-20.086955.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2707744494888737,\n \"acc_stderr\": 0.031386153673103406,\n \"acc_norm\": 0.272619145262433,\n \"acc_norm_stderr\": 0.03218390044538922,\n \"mc1\": 0.21664626682986537,\n \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.37888278063696673,\n \"mc2_stderr\": 0.014137600334109192\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313366,\n \"acc_norm\": 0.29180887372013653,\n \"acc_norm_stderr\": 0.013284525292403508\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36367257518422624,\n \"acc_stderr\": 0.004800728138792374,\n \"acc_norm\": 0.4534953196574388,\n \"acc_norm_stderr\": 0.004968151878211051\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3283018867924528,\n \"acc_stderr\": 0.028901593612411784,\n \"acc_norm\": 0.3283018867924528,\n \"acc_norm_stderr\": 0.028901593612411784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083287,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083287\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.0328264938530415,\n \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.0328264938530415\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361825,\n \"acc_norm\": 0.33678756476683935,\n 
\"acc_norm_stderr\": 0.034107802518361825\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844065,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844065\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456053,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.33884297520661155,\n \"acc_stderr\": 0.0432076780753667,\n \"acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.0432076780753667\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20051085568326948,\n \"acc_stderr\": 0.014317653708594207,\n \"acc_norm\": 0.20051085568326948,\n \"acc_norm_stderr\": 0.014317653708594207\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451163,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451163\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n \"acc_stderr\": 0.010813585552659674,\n \"acc_norm\": 0.23402868318122555,\n \"acc_norm_stderr\": 0.010813585552659674\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.02604066247420126,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.02604066247420126\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.027979823538744546,\n \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.027979823538744546\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.37888278063696673,\n \"mc2_stderr\": 0.014137600334109192\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5430149960536701,\n \"acc_stderr\": 0.01400038676159829\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022545087\n }\n}\n```", "repo_url": "https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|arc:challenge|25_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|gsm8k|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hellaswag|10_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["**/details_harness|winogrande|5_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T16-13-20.086955.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T16_13_20.086955", "path": ["results_2024-01-22T16-13-20.086955.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T16-13-20.086955.parquet"]}]}]} | 2024-01-22T16:15:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1
Dataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_gpt2_e0_se1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
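A minimal sketch is shown below; the repository name assumes the standard `open-llm-leaderboard/details_<org>__<model>` naming used for these evaluation-run datasets, and the config/split mirror the loading pattern used in the other cards of this collection:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details; the "train" split points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1",
    "harness_winogrande_5",
    split="train",
)
```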
## Latest results
These are the latest results from run 2024-01-22T16:13:20.086955 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_gpt2_e0_se1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T16:13:20.086955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1\n\n\n\nDataset automatically created during the evaluation run of model LordNoah/Alpaca_refine_gpt2_e0_se1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T16:13:20.086955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1fb8cdacb54a8a2393cc3cbf1078b2ec30ddc30c |
#### Dataset:
This is the data used for training [Snorkel model](https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO)
We use ONLY the prompts from [UltraFeedback](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized); **no external LLM responses used**.
#### Methodology:
1. Generate 5 response variations for each prompt from a subset of 20,000 using the LLM - to start, we used [Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2).
2. Apply [PairRM](https://huggingface.co/llm-blender/PairRM) for response reranking.
3. Update the LLM by applying Direct Preference Optimization (DPO) on the top (chosen) and bottom (rejected) responses.
4. Use this LLM as the base model for the next iteration and use a different set of 20,000 prompts, repeating three times in total.
Please see the model page for more details on the methodology.
Columns:
- prompt: the current prompt
- chosen: the list of messages for the chosen response
- rejected: the list of messages for the rejected response
- all_generated_responses: The 5 generated responses
- all_rm_scores: The 5 corresponding reward model scores
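
To make the relationship between these columns and the DPO pairs concrete, here is a minimal sketch (not the exact pipeline code) of deriving a preference pair from one row; note that the released `chosen`/`rejected` columns store full message lists (content/role), while this sketch uses bare response strings.

```python
# Minimal sketch: derive a DPO-style preference pair from one dataset row.
# Column names (prompt, all_generated_responses, all_rm_scores) are the ones
# documented above; the pairing rule (highest vs. lowest PairRM score) follows
# the methodology description.
def to_preference_pair(row):
    scores = row["all_rm_scores"]
    responses = row["all_generated_responses"]
    chosen = responses[scores.index(max(scores))]    # top-ranked response
    rejected = responses[scores.index(min(scores))]  # bottom-ranked response
    return {"prompt": row["prompt"], "chosen": chosen, "rejected": rejected}
```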
Splits:
- train/test_iteration_{n}: The dataset used at the n_th iteration. We did 3 iterations in total.
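
One iteration's data can be loaded with the `datasets` library, for example (a minimal sketch using the split names listed above):

```python
from datasets import load_dataset

# Load the training split of the first DPO iteration.
ds = load_dataset(
    "snorkelai/Snorkel-Mistral-PairRM-DPO-Dataset",
    split="train_iteration_1",
)
print(ds.column_names)  # includes prompt, chosen, rejected, all_generated_responses, all_rm_scores
```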
**Training recipe**: This data is formatted to be compatible with Hugging Face's [Zephyr recipe](https://github.com/huggingface/alignment-handbook/tree/main/recipes/zephyr-7b-beta).
We executed the n_th DPO iteration using the "train/test_iteration_{n}". | snorkelai/Snorkel-Mistral-PairRM-DPO-Dataset | [
"task_categories:text-generation",
"license:apache-2.0",
"region:us"
] | 2024-01-22T16:18:35+00:00 | {"license": "apache-2.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "prompt_id", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "all_generated_responses", "sequence": "string"}, {"name": "all_rm_scores", "sequence": "float64"}], "splits": [{"name": "train_iteration_1", "num_bytes": 276734865, "num_examples": 19766}, {"name": "test_iteration_1", "num_bytes": 13616124, "num_examples": 994}, {"name": "train_iteration_2", "num_bytes": 313248218, "num_examples": 19958}, {"name": "test_iteration_2", "num_bytes": 15553468, "num_examples": 1000}, {"name": "train_iteration_3", "num_bytes": 379805458, "num_examples": 19996}, {"name": "test_iteration_3", "num_bytes": 19111694, "num_examples": 1000}], "download_size": 485703305, "dataset_size": 1018069827}, "configs": [{"config_name": "default", "data_files": [{"split": "train_iteration_1", "path": "data/train_iteration_1-*"}, {"split": "test_iteration_1", "path": "data/test_iteration_1-*"}, {"split": "train_iteration_2", "path": "data/train_iteration_2-*"}, {"split": "test_iteration_2", "path": "data/test_iteration_2-*"}, {"split": "train_iteration_3", "path": "data/train_iteration_3-*"}, {"split": "test_iteration_3", "path": "data/test_iteration_3-*"}]}]} | 2024-01-23T04:31:44+00:00 | [] | [] | TAGS
#task_categories-text-generation #license-apache-2.0 #region-us
|
#### Dataset:
This is the data used for training Snorkel model
We use ONLY the prompts from UltraFeedback; no external LLM responses used.
#### Methodology:
1. Generate 5 response variations for each prompt from a subset of 20,000 using the LLM - to start, we used Mistral-7B-Instruct-v0.2.
2. Apply PairRM for response reranking.
3. Update the LLM by applying Direct Preference Optimization (DPO) on the top (chosen) and bottom (rejected) responses.
4. Use this LLM as the base model for the next iteration and use a different set of 20,000 prompts, repeating three times in total.
Please see the model page for more details on the methodology.
Columns:
- prompt: the current prompt
- chosen: the list of messages for the chosen response
- rejected: the list of messages for the rejected response
- all_generated_responses: The 5 generated responses
- all_rm_scores: The 5 corresponding reward model scores
Splits:
- train/test_iteration_{n}: The dataset used at the n_th iteration. We did 3 iterations in total.
Training recipe: This data is formatted to be compatible with Hugging Face's Zephyr recipe.
We executed the n_th DPO iteration using the "train/test_iteration_{n}". | [
"#### Dataset:\nThis is the data used for training Snorkel model\n\nWe use ONLY the prompts from UltraFeedback; no external LLM responses used.",
"#### Methodology:\n 1. Generate 5 response variations for each prompt from a subset of 20,000 using the LLM - to start, we used Mistral-7B-Instruct-v0.2.\n 2. Apply PairRM for response reranking.\n 3. Update the LLM by applying Direct Preference Optimization (DPO) on the top (chosen) and bottom (rejected) responses.\n 4. Use this LLM as the base model for the next iteration and use a different set of 20,000 prompts, repeating three times in total.\n\nPlease see the model page for more details on the methodology.\n\nColumns:\n- prompt: the current prompt\n- chosen: the list of messages for the chosen response\n- rejected: the list of messages for the rejected response\n- all_generated_responses: The 5 generated responses\n- all_rm_scores: The 5 corresponding reward model scores\n\nSplits:\n- train/test_iteration_{n}: The dataset used at the n_th iteration. We did 3 iterations in total.\n\nTraining recipe: This data is formatted to be compatible with the Hugging Face's Zephyr recipe.\nWe executed the n_th DPO iteration using the \"train/test_iteration_{n}\"."
] | [
"TAGS\n#task_categories-text-generation #license-apache-2.0 #region-us \n",
"#### Dataset:\nThis is the data used for training Snorkel model\n\nWe use ONLY the prompts from UltraFeedback; no external LLM responses used.",
"#### Methodology:\n 1. Generate 5 response variations for each prompt from a subset of 20,000 using the LLM - to start, we used Mistral-7B-Instruct-v0.2.\n 2. Apply PairRM for response reranking.\n 3. Update the LLM by applying Direct Preference Optimization (DPO) on the top (chosen) and bottom (rejected) responses.\n 4. Use this LLM as the base model for the next iteration and use a different set of 20,000 prompts, repeating three times in total.\n\nPlease see the model page for more details on the methodology.\n\nColumns:\n- prompt: the current prompt\n- chosen: the list of messages for the chosen response\n- rejected: the list of messages for the rejected response\n- all_generated_responses: The 5 generated responses\n- all_rm_scores: The 5 corresponding reward model scores\n\nSplits:\n- train/test_iteration_{n}: The dataset used at the n_th iteration. We did 3 iterations in total.\n\nTraining recipe: This data is formatted to be compatible with the Hugging Face's Zephyr recipe.\nWe executed the n_th DPO iteration using the \"train/test_iteration_{n}\"."
] |
843fc7f9d9471c1989a3f33a409123cf1c668244 | # lilac/OpenOrca
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-OpenOrca
```
or from python with:
```py
import lilac as ll  # assumes the lilac package is installed

# Download this dataset into the local project directory.
ll.download("lilacai/lilac-OpenOrca")
```
| lilacai/lilac-OpenOrca | [
"Lilac",
"region:us"
] | 2024-01-22T16:45:29+00:00 | {"tags": ["Lilac"]} | 2024-01-22T17:02:56+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/OpenOrca
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/OpenOrca\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/OpenOrca\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
cd186b9c100ca18fbed41bf33e63bdbd6cef37ca | from https://www.ffl.kanagawa-u.ac.jp/old/news/2015/img/news_2015060202_script_sp_dq.pdf | dmntrd/QuijoteDeLaMancha_RafaelGil | [
"region:us"
] | 2024-01-22T16:45:32+00:00 | {"dataset_info": {"features": [{"name": "chat", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 49141.81124497992, "num_examples": 199}, {"name": "test", "num_bytes": 12347.18875502008, "num_examples": 50}], "download_size": 41296, "dataset_size": 61489.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-22T16:49:43+00:00 | [] | [] | TAGS
#region-us
| from URL | [] | [
"TAGS\n#region-us \n"
] |
94942a03d1c6bad1113112499a1080f4dd5d3aec |
# Dataset Card for Evaluation run of BarryFutureman/NeuralTurdusVariant1-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/NeuralTurdusVariant1-7B](https://huggingface.co/BarryFutureman/NeuralTurdusVariant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__NeuralTurdusVariant1-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T17:04:26.015836](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralTurdusVariant1-7B/blob/main/results_2024-01-22T17-04-26.015836.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6537211968860746,
"acc_stderr": 0.03201153964256497,
"acc_norm": 0.6529568258973308,
"acc_norm_stderr": 0.032683792409541605,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.6998714475811907,
"mc2_stderr": 0.015140377928333039
},
"harness|arc:challenge|25": {
"acc": 0.7201365187713311,
"acc_stderr": 0.013119040897725922,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7248556064528978,
"acc_stderr": 0.004456743108170734,
"acc_norm": 0.8860784704242183,
"acc_norm_stderr": 0.0031706661225176552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.6998714475811907,
"mc2_stderr": 0.015140377928333039
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184135
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.01291940810865641
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarryFutureman__NeuralTurdusVariant1-7B | [
"region:us"
] | 2024-01-22T17:06:47+00:00 | {"pretty_name": "Evaluation run of BarryFutureman/NeuralTurdusVariant1-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/NeuralTurdusVariant1-7B](https://huggingface.co/BarryFutureman/NeuralTurdusVariant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__NeuralTurdusVariant1-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T17:04:26.015836](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralTurdusVariant1-7B/blob/main/results_2024-01-22T17-04-26.015836.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537211968860746,\n \"acc_stderr\": 0.03201153964256497,\n \"acc_norm\": 0.6529568258973308,\n \"acc_norm_stderr\": 0.032683792409541605,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.6998714475811907,\n \"mc2_stderr\": 0.015140377928333039\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7201365187713311,\n \"acc_stderr\": 0.013119040897725922,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7248556064528978,\n \"acc_stderr\": 0.004456743108170734,\n \"acc_norm\": 0.8860784704242183,\n \"acc_norm_stderr\": 0.0031706661225176552\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n 
\"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.6998714475811907,\n \"mc2_stderr\": 0.015140377928333039\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184135\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n \"acc_stderr\": 
0.01291940810865641\n }\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/NeuralTurdusVariant1-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-04-26.015836.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-04-26.015836.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-04-26.015836.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-04-26.015836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-04-26.015836.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T17_04_26.015836", "path": ["**/details_harness|winogrande|5_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T17-04-26.015836.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_22T17_04_26.015836", "path": ["results_2024-01-22T17-04-26.015836.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T17-04-26.015836.parquet"]}]}]} | 2024-01-22T17:07:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarryFutureman/NeuralTurdusVariant1-7B
Dataset automatically created during the evaluation run of model BarryFutureman/NeuralTurdusVariant1-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-22T17:04:26.015836 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarryFutureman/NeuralTurdusVariant1-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/NeuralTurdusVariant1-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:04:26.015836(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarryFutureman/NeuralTurdusVariant1-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/NeuralTurdusVariant1-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:04:26.015836(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
be8c2f2199a67c37320e3098a4946bddcdd1a276 |
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2",
"harness_winogrande_5",
split="train")
```
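
Each evaluated task lives in its own configuration, so it can also help to enumerate the available configurations first, or to read the aggregated scores from the "results" configuration directly. The snippet below is a minimal sketch using the standard `datasets` API; the `"latest"` split name follows the naming convention described above and should be treated as an assumption about this repo's split layout rather than something stated in the original card.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2"

# List the 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" configuration holds the aggregated metrics; the "latest" split
# should point to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results)
```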
## Latest results
These are the [latest results from run 2024-01-22T17:11:06.460582](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2/blob/main/results_2024-01-22T17-11-06.460582.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6071506593443939,
"acc_stderr": 0.0331634697242051,
"acc_norm": 0.6115566365296805,
"acc_norm_stderr": 0.0338383460849818,
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.7053630317411392,
"mc2_stderr": 0.015058893752819909
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472434,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893456
},
"harness|hellaswag|10": {
"acc": 0.6752638916550487,
"acc_stderr": 0.004673191423861211,
"acc_norm": 0.8530173272256523,
"acc_norm_stderr": 0.003533649851728494
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.01483620516733355,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.01483620516733355
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005567,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005567
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935729,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935729
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.01957695312208883,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.01957695312208883
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.7053630317411392,
"mc2_stderr": 0.015058893752819909
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205201
},
"harness|gsm8k|5": {
"acc": 0.3904473085670963,
"acc_stderr": 0.013437829864668578
}
}
```
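
If you prefer to work with the raw results file shown above rather than the parquet configurations, it can be downloaded directly from the repository with `huggingface_hub`. This is only a sketch: the exact top-level layout of the JSON file is not documented in this card, so the code inspects it before reading the aggregated "all" metrics and falls back to the top level if there is no "results" key.

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2",
    filename="results_2024-01-22T17-11-06.460582.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Inspect the layout first, then read the aggregated metrics. The per-task scores
# may sit either at the top level (as printed above) or under a "results" key.
print(list(data.keys()))
scores = data.get("results", data)
for metric, value in scores.get("all", {}).items():
    print(f"{metric}: {value:.4f}")
```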
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2 | [
"region:us"
] | 2024-01-22T17:13:26+00:00 | {"pretty_name": "Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2", "dataset_summary": "Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T17:11:06.460582](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2/blob/main/results_2024-01-22T17-11-06.460582.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6071506593443939,\n \"acc_stderr\": 0.0331634697242051,\n \"acc_norm\": 0.6115566365296805,\n \"acc_norm_stderr\": 0.0338383460849818,\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.7053630317411392,\n \"mc2_stderr\": 0.015058893752819909\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472434,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893456\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6752638916550487,\n \"acc_stderr\": 0.004673191423861211,\n \"acc_norm\": 0.8530173272256523,\n \"acc_norm_stderr\": 0.003533649851728494\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132274,\n \"acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n \"acc_norm\": 0.8549222797927462,\n 
\"acc_norm_stderr\": 0.02541634309630644\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n \"acc_stderr\": 0.01483620516733355,\n \"acc_norm\": 0.7790549169859514,\n \"acc_norm_stderr\": 0.01483620516733355\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n \"acc_stderr\": 0.015476515438005567,\n \"acc_norm\": 0.3106145251396648,\n \"acc_norm_stderr\": 0.015476515438005567\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6258169934640523,\n \"acc_stderr\": 0.01957695312208883,\n \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.01957695312208883\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.7053630317411392,\n \"mc2_stderr\": 0.015058893752819909\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3904473085670963,\n \"acc_stderr\": 0.013437829864668578\n }\n}\n```", "repo_url": "https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-11-06.460582.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-11-06.460582.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-11-06.460582.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-11-06.460582.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-11-06.460582.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["**/details_harness|winogrande|5_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T17-11-06.460582.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T17_11_06.460582", "path": ["results_2024-01-22T17-11-06.460582.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T17-11-06.460582.parquet"]}]}]} | 2024-01-22T17:13:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2
Dataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
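A minimal sketch with the `datasets` library is shown below; the repository id is taken from this card, and `harness_winogrande_5` is just one of the available task configurations (any other configuration name listed in the metadata works the same way):

```python
from datasets import load_dataset

# Per-sample details for one task configuration;
# the "train" split always points to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics live in the "results" configuration,
# whose "latest" split mirrors the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e2",
    "results",
    split="latest",
)
```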
## Latest results
These are the latest results from run 2024-01-22T17:11:06.460582 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks). You can find each one in the results and the "latest" split for each eval:
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:11:06.460582(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:11:06.460582(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1f52286de2227ca734a8d490996fbd114f826032 | # Dataset Card for "Calc-ape210k_selftrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | MU-NLPC/Calc-ape210k_selftrain | [
"region:us"
] | 2024-01-22T17:14:57+00:00 | {"dataset_info": {"config_name": "0-50k", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_chinese", "dtype": "string"}, {"name": "chain", "dtype": "string"}, {"name": "result", "dtype": "string"}, {"name": "result_float", "dtype": "float64"}, {"name": "equation", "dtype": "string"}, {"name": "template", "dtype": "string"}, {"name": "prediction", "sequence": "string"}, {"name": "model_checkpoint", "dtype": "string"}, {"name": "pred_result", "sequence": "string"}, {"name": "is_correct", "sequence": "bool"}], "splits": [{"name": "train", "num_bytes": 315968226, "num_examples": 50000}], "download_size": 94681038, "dataset_size": 315968226}, "configs": [{"config_name": "0-50k", "data_files": [{"split": "train", "path": "0-50k/train-*"}]}]} | 2024-01-22T17:15:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Calc-ape210k_selftrain"
More Information needed | [
"# Dataset Card for \"Calc-ape210k_selftrain\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Calc-ape210k_selftrain\"\n\nMore Information needed"
] |
cbc9513d3ee9dfdba72a1404a9ff283528ad645e |
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T17:17:10.255551](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3/blob/main/results_2024-01-22T17-17-10.255551.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks). You can find each one in the results and the "latest" split for each eval:
```python
{
"all": {
"acc": 0.6076374032172701,
"acc_stderr": 0.0331629731019256,
"acc_norm": 0.6120606501518099,
"acc_norm_stderr": 0.03383578080966383,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7059182813774988,
"mc2_stderr": 0.01504259695078292
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225407,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6750647281418044,
"acc_stderr": 0.004673934837150448,
"acc_norm": 0.8531169089822744,
"acc_norm_stderr": 0.0035326587973575525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.02794172734625631,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.02794172734625631
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934488,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934488
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903227,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903227
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866353,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7059182813774988,
"mc2_stderr": 0.01504259695078292
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.39727065959059893,
"acc_stderr": 0.013478659652337787
}
}
```
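To work with these numbers programmatically, a minimal sketch (assuming the JSON above has been saved locally; the filename used here is only an illustration) could pull out the headline aggregates like this:

```python
import json

# Assumption: the results JSON shown above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

average_acc = results["all"]["acc"]              # aggregate accuracy across the evaluated tasks
gsm8k_acc = results["harness|gsm8k|5"]["acc"]    # 5-shot GSM8K accuracy
mc2 = results["harness|truthfulqa:mc|0"]["mc2"]  # TruthfulQA MC2 score
print(f"acc: {average_acc:.4f}  gsm8k: {gsm8k_acc:.4f}  truthfulqa mc2: {mc2:.4f}")
```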
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3 | [
"region:us"
] | 2024-01-22T17:19:36+00:00 | {"pretty_name": "Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3", "dataset_summary": "Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T17:17:10.255551](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3/blob/main/results_2024-01-22T17-17-10.255551.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6076374032172701,\n \"acc_stderr\": 0.0331629731019256,\n \"acc_norm\": 0.6120606501518099,\n \"acc_norm_stderr\": 0.03383578080966383,\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7059182813774988,\n \"mc2_stderr\": 0.01504259695078292\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225407,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6750647281418044,\n \"acc_stderr\": 0.004673934837150448,\n \"acc_norm\": 0.8531169089822744,\n \"acc_norm_stderr\": 0.0035326587973575525\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n 
\"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n \"acc_stderr\": 0.02794172734625631,\n \"acc_norm\": 0.5935483870967742,\n \"acc_norm_stderr\": 0.02794172734625631\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n 
\"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 
0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.014774358319934488,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934488\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.015414494487903227,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.015414494487903227\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866353,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866353\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7059182813774988,\n \"mc2_stderr\": 0.01504259695078292\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.39727065959059893,\n \"acc_stderr\": 0.013478659652337787\n }\n}\n```", "repo_url": "https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["**/details_harness|winogrande|5_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-22T17-17-10.255551.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_22T17_17_10.255551", "path": ["results_2024-01-22T17-17-10.255551.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T17-17-10.255551.parquet"]}]}]} | 2024-01-22T17:20:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3
Dataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
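A minimal sketch with the `datasets` library is given below; the repository id and the chosen configuration are assumptions based on the leaderboard's usual `details_<org>__<model>` naming, since they are not spelled out in this card:

```python
# Minimal sketch, assuming the details repo follows the Open LLM Leaderboard
# naming convention "open-llm-leaderboard/details_<org>__<model>".
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3",  # assumed repo id
    "harness_winogrande_5",  # any of the 63 configurations listed in the metadata above
    split="latest",          # or the timestamped split "2024_01_22T17_17_10.255551"
)
print(data)
```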
## Latest results
These are the latest results from run 2024-01-22T17:17:10.255551 (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
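(The full results table is not reproduced here; the sketch below shows one way to retrieve it, again assuming the repository id from the leaderboard's naming convention.)

```python
# Sketch: pull the aggregated metrics of the latest run from the "results" configuration.
# The repo id is assumed from the leaderboard naming convention; adjust it if it differs.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3",  # assumed
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated results of run 2024-01-22T17:17:10
```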
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:17:10.255551(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:17:10.255551(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |