sha (string, 40-40) | text (string, 1-13.4M) | id (string, 2-117) | tags (sequence, 1-7.91k) | created_at (string, 25-25) | metadata (string, 2-875k) | last_modified (string, 25-25) | arxiv (sequence, 0-25) | languages (sequence, 0-7.91k) | tags_str (string, 17-159k) | text_str (string, 1-447k) | text_lists (sequence, 0-352) | processed_texts (sequence, 1-353) |
---|---|---|---|---|---|---|---|---|---|---|---|---|
f58a5fe78263577b79e0eb0e40e36eebd6900c1f |
## Cluj Napoca Weather Dataset
A weather dataset of Cluj-Napoca, scraped from the OpenWeather History API. The data covers January 1st, 2008
through May 2023 at an hourly resolution. The columns are presented in the image below:

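As context for how such data can be collected, here is a minimal sketch of one request against the OpenWeather History API; the Cluj-Napoca coordinates, time window, and API key below are illustrative assumptions, not the author's actual script.
```python
# Sketch: fetch one day of hourly history from the OpenWeather History API.
# Coordinates, time window, and the API key placeholder are assumptions.
import requests

params = {
    "lat": 46.77, "lon": 23.59,  # approximate Cluj-Napoca coordinates
    "type": "hour",
    "start": 1199145600,         # 2008-01-01 00:00 UTC (unix time)
    "end": 1199232000,           # one day later; longer spans are fetched in chunks
    "appid": "YOUR_API_KEY",     # hypothetical placeholder
}
resp = requests.get("https://history.openweathermap.org/data/2.5/history/city", params=params)
resp.raise_for_status()
for record in resp.json().get("list", []):
    print(record["dt"], record["main"]["temp"])  # hourly timestamp and temperature
```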
### Long Time Series Predictions
Clone the [Time-Series-Library](https://github.com/thuml/Time-Series-Library) repository.
```bash
git clone https://github.com/thuml/Time-Series-Library.git
```
To train the models, create a folder called `dataset`. Then either modify the `.sh` scripts (they didn't work for me) or run the [predict.ipynb](https://huggingface.co/datasets/LaurentiuStancioiu/Cluj-Napoca-Weather-OpenWeather-data/blob/main/predict.ipynb)
Jupyter notebook.
Alternatively, check the Hugging Face implementations of [Informer](https://huggingface.co/docs/transformers/model_doc/informer),
[Autoformer](https://huggingface.co/docs/transformers/model_doc/autoformer), [Vanilla Transformer](https://huggingface.co/docs/transformers/model_doc/time_series_transformer),
[PatchTST](https://huggingface.co/docs/transformers/model_doc/patchtst), and [PatchTSMixer](https://huggingface.co/docs/transformers/model_doc/patchtsmixer).
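As a rough sketch of the Hugging Face route (not the setup behind the results below), the following instantiates a PatchTST forecaster for multichannel hourly data; the channel count, window lengths, and batch size are illustrative assumptions.
```python
# Sketch: a PatchTST forecaster from `transformers`; all sizes are assumptions.
import torch
from transformers import PatchTSTConfig, PatchTSTForPrediction

config = PatchTSTConfig(
    num_input_channels=7,   # assumed number of weather variables
    context_length=336,     # two weeks of hourly history
    prediction_length=96,   # forecast the next four days
)
model = PatchTSTForPrediction(config)

# Random stand-in tensors shaped (batch, time, channels).
past = torch.randn(8, config.context_length, config.num_input_channels)
future = torch.randn(8, config.prediction_length, config.num_input_channels)
out = model(past_values=past, future_values=future)  # future_values is the training target
out.loss.backward()  # an optimizer step would follow in a real training loop
```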
### Results
Without any hyperparameter tuning, the results on this dataset were the following:

A description of what I did to get to those results can be found [here](https://huggingface.co/datasets/LaurentiuStancioiu/Cluj-Napoca-Weather-OpenWeather-data/blob/main/Long%20Term%20Time%20Series%20Forecasting%20for%20Cluj-Napoca%20Weather%20Prediction.pdf).
Also, the trained checkpoints are [here](https://huggingface.co/datasets/LaurentiuStancioiu/Cluj-Napoca-Weather-OpenWeather-data/tree/main).
The data is licensed, per OpenWeather, under the [ODbL license](https://opendatacommons.org/licenses/odbl/). | LaurentiuStancioiu/Cluj-Napoca-Weather-OpenWeather-data | [
"task_categories:tabular-regression",
"size_categories:100K<n<1M",
"language:en",
"license:odbl",
"climate",
"region:us"
] | 2024-02-03T14:24:18+00:00 | {"language": ["en"], "license": "odbl", "size_categories": ["100K<n<1M"], "task_categories": ["tabular-regression"], "tags": ["climate"]} | 2024-02-03T15:31:22+00:00 | [] | [
"en"
] | TAGS
#task_categories-tabular-regression #size_categories-100K<n<1M #language-English #license-odbl #climate #region-us
|
## Cluj Napoca Weather Dataset
A weather dataset of Cluj-Napoca, scraped from the OpenWeather History API. The data covers January 1st, 2008
through May 2023 at an hourly resolution. The columns are presented in the image below:
!image/png
### Long Time Series Predictions
Clone the Time-Series-Library repository.
To train the models, create a folder called dataset. Then either modify the .sh scripts (they didn't work for me) or run the URL
Jupyter notebook.
Alternatively, check the Hugging Face implementations of Informer,
Autoformer, Vanilla Transformer,
PatchTST, and PatchTSMixer.
### Results
Without any hyperparameter tuning, the results on this dataset were the following:
!image/png
A description of what I did to get to those results can be found here.
Also, the trained checkpoints are here.
The data is licensed, per OpenWeather, under the ODbL license. | [
"## Cluj Napoca Weather Dataset\n\nA weather dataset of Cluj Napoca taken from the OpenWheather History API. It was scraped from the Open Weather\nMap using their Weather API. The data collected was from January 1st 2008 until May 2023\nat an hourly rate. The columns are presented in the image below:\n\n!image/png",
"### Long Time Series Predictions\n\nClone the Time-Series-Library repository.\n\n\nFor training the model create a folder called dataset. After that you can either modify the .sh files (for me it didn't work), or run the URL\njupyter notebook.\nAlternatively, check the HuggingFace implementations for Informer,\nAutoformer, Vanilla Transformer,\nPatchTST and PatchTSMixer",
"### Results\n\nWithout any hyperparameter tuning the results on this dataset were the following: \n\n!image/png\n\nA description of what I did to get to those results can be found here. \nAlso, the trained checkpoints are here. \nThe data is licensed per OpenWeather under the odbl license."
] | [
"TAGS\n#task_categories-tabular-regression #size_categories-100K<n<1M #language-English #license-odbl #climate #region-us \n",
"## Cluj Napoca Weather Dataset\n\nA weather dataset of Cluj Napoca taken from the OpenWheather History API. It was scraped from the Open Weather\nMap using their Weather API. The data collected was from January 1st 2008 until May 2023\nat an hourly rate. The columns are presented in the image below:\n\n!image/png",
"### Long Time Series Predictions\n\nClone the Time-Series-Library repository.\n\n\nFor training the model create a folder called dataset. After that you can either modify the .sh files (for me it didn't work), or run the URL\njupyter notebook.\nAlternatively, check the HuggingFace implementations for Informer,\nAutoformer, Vanilla Transformer,\nPatchTST and PatchTSMixer",
"### Results\n\nWithout any hyperparameter tuning the results on this dataset were the following: \n\n!image/png\n\nA description of what I did to get to those results can be found here. \nAlso, the trained checkpoints are here. \nThe data is licensed per OpenWeather under the odbl license."
] |
8fd27aefed63db5724beb8c4ec53b9261bdcf623 | I moved AEZAKMI V2 in ShareGPT format to a different repo so that it's easier to use with the HF datasets library. | adamo1139/AEZAKMI_v2_sharegpt | [
"license:other",
"region:us"
] | 2024-02-03T14:31:41+00:00 | {"license": "other", "license_name": "other", "license_link": "LICENSE"} | 2024-02-03T14:40:35+00:00 | [] | [] | TAGS
#license-other #region-us
| I moved AEZAKMI V2 in ShareGPT format to a different repo so that it's easier to use with the HF datasets library. | [] | [
"TAGS\n#license-other #region-us \n"
] |
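For example, the repo can be pulled with the snippet below; the "train" split name and the ShareGPT-style "conversations" field are the usual conventions, assumed here rather than confirmed by the card.
```python
# Sketch: load the ShareGPT-formatted repo with the HF datasets library.
# The split and field names are conventional assumptions.
from datasets import load_dataset

ds = load_dataset("adamo1139/AEZAKMI_v2_sharegpt", split="train")
print(ds[0]["conversations"])  # ShareGPT rows usually carry a "conversations" list
```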
782197c9e4e737f73c9efaae457ba065dc331611 |
A list of all registered .com domain names composed only of ASCII characters 97 to 122 (lowercase a-z), as of January 2024. | jeremygf/domains-alpha | [
"size_categories:100M<n<1B",
"web",
"domain names",
"text",
"region:us"
] | 2024-02-03T14:37:29+00:00 | {"size_categories": ["100M<n<1B"], "tags": ["web", "domain names", "text"]} | 2024-02-03T20:57:56+00:00 | [] | [] | TAGS
#size_categories-100M<n<1B #web #domain names #text #region-us
|
A list of all registered .com domain names composed only of ASCII characters 97 to 122 (lowercase a-z), as of January 2024. | [] | [
"TAGS\n#size_categories-100M<n<1B #web #domain names #text #region-us \n"
] |
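Concretely, "ASCII characters 97 to 122" is the lowercase letters a-z; a small sketch of that membership test, with invented sample names, follows.
```python
# Sketch: the charset constraint described above, i.e. labels restricted
# to ASCII 97-122 (lowercase a-z). The sample domains are invented.
import string

ALPHA = set(string.ascii_lowercase)  # chr(97) .. chr(122)

def is_alpha_com(domain: str) -> bool:
    label = domain.removesuffix(".com")
    return bool(label) and set(label) <= ALPHA

print(is_alpha_com("example.com"))   # True
print(is_alpha_com("ex-ample.com"))  # False: '-' falls outside 97-122
```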
96cd753aff2ed4490883c0e5ed30938bac6298b5 |
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat",
"harness_winogrande_5",
split="train")
```
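The remaining configurations can be enumerated programmatically; the snippet below is a generic `datasets` helper call, not something prescribed by this card.
```python
# Sketch: list the evaluation configurations available in this details repo.
from datasets import get_dataset_config_names

configs = get_dataset_config_names("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat")
print(len(configs), configs[:5])
```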
## Latest results
These are the [latest results from run 2024-02-03T15:27:05.050992](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat/blob/main/results_2024-02-03T15-27-05.050992.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.256701988196126,
"acc_stderr": 0.030917309603694414,
"acc_norm": 0.2577806846513683,
"acc_norm_stderr": 0.031657483876574605,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.43935436863671357,
"mc2_stderr": 0.01489259237232499
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063286,
"acc_norm": 0.25597269624573377,
"acc_norm_stderr": 0.012753013241244516
},
"harness|hellaswag|10": {
"acc": 0.3301135232025493,
"acc_stderr": 0.004692926794268453,
"acc_norm": 0.39693288189603665,
"acc_norm_stderr": 0.004882619484166603
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.02967416752010146,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.02967416752010146
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869334,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869334
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229886,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229886
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212802,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212802
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073817,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073817
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051463,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051463
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874965,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874965
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094455,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094455
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25798212005108556,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.25798212005108556,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.011363135278651414,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.011363135278651414
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.0324000482559469,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.0324000482559469
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.43935436863671357,
"mc2_stderr": 0.01489259237232499
},
"harness|winogrande|5": {
"acc": 0.5146014206787688,
"acc_stderr": 0.01404649238327584
},
"harness|gsm8k|5": {
"acc": 0.03184230477634572,
"acc_stderr": 0.0048363485582609035
}
}
```
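For instance, the per-task entries above can be reduced to a single MMLU-style macro-average; the sketch below assumes the linked results file has been downloaded and, as the eval harness usually lays it out, nests the per-task metrics under a top-level "results" key.
```python
# Sketch: macro-average accuracy over the hendrycksTest (MMLU) subtasks.
# Assumes the harness layout with per-task metrics under a "results" key.
import json

with open("results_2024-02-03T15-27-05.050992.json") as f:
    per_task = json.load(f)["results"]

accs = [m["acc"] for task, m in per_task.items() if task.startswith("harness|hendrycksTest")]
print(f"{len(accs)} MMLU subtasks, macro-average acc = {sum(accs) / len(accs):.4f}")
```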
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat | [
"region:us"
] | 2024-02-03T15:12:20+00:00 | {"pretty_name": "Evaluation run of AIGym/deepseek-coder-1.3b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T15:27:05.050992](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat/blob/main/results_2024-02-03T15-27-05.050992.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.256701988196126,\n \"acc_stderr\": 0.030917309603694414,\n \"acc_norm\": 0.2577806846513683,\n \"acc_norm_stderr\": 0.031657483876574605,\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.43935436863671357,\n \"mc2_stderr\": 0.01489259237232499\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063286,\n \"acc_norm\": 0.25597269624573377,\n \"acc_norm_stderr\": 0.012753013241244516\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3301135232025493,\n \"acc_stderr\": 0.004692926794268453,\n \"acc_norm\": 0.39693288189603665,\n \"acc_norm_stderr\": 0.004882619484166603\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.02967416752010146,\n \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.02967416752010146\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544074,\n \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544074\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.029957851329869334,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.029957851329869334\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.2161290322580645,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229886,\n \"acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229886\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 
0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212802,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212802\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073817,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073817\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051463,\n \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051463\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343574,\n \"acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343574\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303529,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303529\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n \"acc_stderr\": 0.029763779406874965,\n \"acc_norm\": 0.26905829596412556,\n \"acc_norm_stderr\": 0.029763779406874965\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094455,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094455\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n 
\"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n \"acc_stderr\": 0.011363135278651414,\n \"acc_norm\": 0.27183833116036504,\n \"acc_norm_stderr\": 0.011363135278651414\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687758,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687758\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n \"acc_stderr\": 0.0324000482559469,\n \"acc_norm\": 0.22289156626506024,\n \"acc_norm_stderr\": 0.0324000482559469\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.43935436863671357,\n \"mc2_stderr\": 0.01489259237232499\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.01404649238327584\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \"acc_stderr\": 0.0048363485582609035\n }\n}\n```", "repo_url": "https://huggingface.co/AIGym/deepseek-coder-1.3b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|arc:challenge|25_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|arc:challenge|25_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|gsm8k|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|gsm8k|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hellaswag|10_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hellaswag|10_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-09-58.075482.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T15-09-58.075482.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": 
"2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-09-58.075482.parquet"]}, 
{"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["**/details_harness|winogrande|5_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": ["**/details_harness|winogrande|5_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T15-27-05.050992.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T15_09_58.075482", "path": ["results_2024-02-03T15-09-58.075482.parquet"]}, {"split": "2024_02_03T15_27_05.050992", "path": 
["results_2024-02-03T15-27-05.050992.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T15-27-05.050992.parquet"]}]}]} | 2024-02-03T15:29:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat
Dataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
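The original snippet was stripped from this card; a minimal sketch following the loader pattern of the sibling cards in this dump (the repo id is inferred from the card title and is an assumption):

```python
from datasets import load_dataset

# Repo id inferred from the card title (assumption; the original snippet was
# stripped). Sibling cards use "open-llm-leaderboard/details_<org>__<model>".
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat",
	"harness_winogrande_5",
	split="train")
```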
## Latest results
These are the latest results from run 2024-02-03T15:27:05.050992 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T15:27:05.050992(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T15:27:05.050992(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7f44d149bdbdfcc5ff870146023362fec9a9d479 | # Dataset Card for "phanloaicauhoiphapluat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | phamtungthuy/phanloaicauhoiphapluat | [
"region:us"
] | 2024-02-03T15:17:50+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 20635817, "num_examples": 55527}, {"name": "train", "num_bytes": 186721747, "num_examples": 523337}], "download_size": 80518127, "dataset_size": 207357564}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T15:18:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "phanloaicauhoiphapluat"
More Information needed | [
"# Dataset Card for \"phanloaicauhoiphapluat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"phanloaicauhoiphapluat\"\n\nMore Information needed"
] |
6767f8e8f0d18cb0ca6a8832c68c4ac7577ebefa |
# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/TinyLlama-1.1B-2.5T-chat](https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat",
"harness_winogrande_5",
split="train")
```
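To explore the other configurations, you can enumerate them and pull the aggregated scores directly; a minimal sketch assuming the standard `datasets` API (the "results" config and its "latest" split are the ones listed in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split of "results" holds the aggregated metrics shown below.
results = load_dataset(repo, "results", split="latest")
```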
## Latest results
These are the [latest results from run 2024-02-03T15:30:41.915912](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat/blob/main/results_2024-02-03T15-30-41.915912.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27000171868537254,
"acc_stderr": 0.031271197448198736,
"acc_norm": 0.2714719344502093,
"acc_norm_stderr": 0.032051933008136725,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.3879515338680291,
"mc2_stderr": 0.014081119436170311
},
"harness|arc:challenge|25": {
"acc": 0.32337883959044367,
"acc_stderr": 0.013669421630012132,
"acc_norm": 0.3447098976109215,
"acc_norm_stderr": 0.01388881628678211
},
"harness|hellaswag|10": {
"acc": 0.4502091216889066,
"acc_stderr": 0.004964979120927575,
"acc_norm": 0.5970922127066322,
"acc_norm_stderr": 0.004894801119898594
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03591444084196968,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03591444084196968
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.034559302019248124,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.034559302019248124
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048488,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048488
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031093,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031093
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916648,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916648
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.022752388839776826,
"acc_norm": 0.2794871794871795,
"acc_norm_stderr": 0.022752388839776826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.02738140692786896,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.02738140692786896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2771392081736909,
"acc_stderr": 0.016005636294122428,
"acc_norm": 0.2771392081736909,
"acc_norm_stderr": 0.016005636294122428
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897902,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897902
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351277,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351277
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.024264769439988464,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.024264769439988464
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113902,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113902
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779627,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.3879515338680291,
"mc2_stderr": 0.014081119436170311
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008469
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887248
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat | [
"region:us"
] | 2024-02-03T15:32:29+00:00 | {"pretty_name": "Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGym/TinyLlama-1.1B-2.5T-chat](https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T15:30:41.915912](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat/blob/main/results_2024-02-03T15-30-41.915912.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27000171868537254,\n \"acc_stderr\": 0.031271197448198736,\n \"acc_norm\": 0.2714719344502093,\n \"acc_norm_stderr\": 0.032051933008136725,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.3879515338680291,\n \"mc2_stderr\": 0.014081119436170311\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32337883959044367,\n \"acc_stderr\": 0.013669421630012132,\n \"acc_norm\": 0.3447098976109215,\n \"acc_norm_stderr\": 0.01388881628678211\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4502091216889066,\n \"acc_stderr\": 0.004964979120927575,\n \"acc_norm\": 0.5970922127066322,\n \"acc_norm_stderr\": 0.004894801119898594\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03591444084196968,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03591444084196968\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566016\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.034559302019248124,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.034559302019248124\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.291005291005291,\n \"acc_stderr\": 0.02339382650048488,\n \"acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048488\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n \"acc_norm\": 0.26424870466321243,\n 
\"acc_norm_stderr\": 0.03182155050916648\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776826,\n \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.022752388839776826\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.02738140692786896,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.02738140692786896\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2771392081736909,\n \"acc_stderr\": 0.016005636294122428,\n \"acc_norm\": 0.2771392081736909,\n \"acc_norm_stderr\": 0.016005636294122428\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n \"acc_stderr\": 0.014005843570897902,\n \"acc_norm\": 0.22681564245810057,\n \"acc_norm_stderr\": 0.014005843570897902\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351277,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351277\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.20921985815602837,\n \"acc_stderr\": 0.024264769439988464,\n \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.024264769439988464\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n \"acc_stderr\": 0.010976425013113902,\n \"acc_norm\": 0.24445893089960888,\n \"acc_norm_stderr\": 0.010976425013113902\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779627,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779627\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.3879515338680291,\n \"mc2_stderr\": 0.014081119436170311\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008469\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.0029206661987887248\n }\n}\n```", "repo_url": "https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|arc:challenge|25_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|gsm8k|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hellaswag|10_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-30-41.915912.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-30-41.915912.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-30-41.915912.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T15-30-41.915912.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-30-41.915912.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["**/details_harness|winogrande|5_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-03T15-30-41.915912.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T15_30_41.915912", "path": ["results_2024-02-03T15-30-41.915912.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T15-30-41.915912.parquet"]}]}]} | 2024-02-03T15:32:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat
Dataset automatically created during the evaluation run of model AIGym/TinyLlama-1.1B-2.5T-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
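A minimal sketch, assuming this details repo follows the usual open-llm-leaderboard naming convention (`details_<org>__<model>`) and one of the per-task configuration names listed in the metadata above:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this model; any other
# configuration name (e.g. "harness_gsm8k_5") works the same way.
data = load_dataset(
    "open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat",
    "harness_winogrande_5",
    split="train",
)
```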
## Latest results
These are the latest results from run 2024-02-03T15:30:41.915912 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]

BibTeX:

APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
8ad6bc506c9d0101daaabe25c1b9f1a8c4db32ec | # Dataset Card for "lmind_nq_train5000_eval5000_v1_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_qa | [
"region:us"
] | 2024-02-03T15:34:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 581636, "num_examples": 5000}, {"name": "validation", "num_bytes": 580393, "num_examples": 5000}], "download_size": 13415634, "dataset_size": 21592172}} | 2024-02-03T15:35:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_qa"
More Information needed |
7a65fa12199c90437747171335d1557ed7b6a63e |
# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-6.7b-chat](https://huggingface.co/AIGym/deepseek-coder-6.7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat",
"harness_winogrande_5",
split="train")
```
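If you want to see which configurations are available before loading one, the `datasets` library can enumerate them; a small sketch (the names match the per-task configs described in this card):

```python
from datasets import get_dataset_config_names

# Every configuration in this details repo, e.g.
# "harness_arc_challenge_25", "harness_gsm8k_5", ..., plus "results".
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat"
)
print(len(configs), configs[:5])
```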
## Latest results
These are the [latest results from run 2024-02-03T15:39:13.314574](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat/blob/main/results_2024-02-03T15-39-13.314574.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38148568306158886,
"acc_stderr": 0.034313813059654794,
"acc_norm": 0.3844202039529435,
"acc_norm_stderr": 0.03507176965207334,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299969,
"mc2": 0.4293857577430447,
"mc2_stderr": 0.014687279182014996
},
"harness|arc:challenge|25": {
"acc": 0.33276450511945393,
"acc_stderr": 0.01376986304619231,
"acc_norm": 0.36006825938566556,
"acc_norm_stderr": 0.014027516814585188
},
"harness|hellaswag|10": {
"acc": 0.40938060147381,
"acc_stderr": 0.004907146229347545,
"acc_norm": 0.5374427404899422,
"acc_norm_stderr": 0.0049757708054646455
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.03656343653353158,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.03656343653353158
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3709677419354839,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.3709677419354839,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43005181347150256,
"acc_stderr": 0.03572954333144808,
"acc_norm": 0.43005181347150256,
"acc_norm_stderr": 0.03572954333144808
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3669724770642202,
"acc_stderr": 0.020664675659520532,
"acc_norm": 0.3669724770642202,
"acc_norm_stderr": 0.020664675659520532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.033933885849584046,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.033933885849584046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.32489451476793246,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.32489451476793246,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.038566721635489125,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.038566721635489125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091265,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4125159642401022,
"acc_stderr": 0.01760414910867193,
"acc_norm": 0.4125159642401022,
"acc_norm_stderr": 0.01760414910867193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251192,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4437299035369775,
"acc_stderr": 0.02821768355665231,
"acc_norm": 0.4437299035369775,
"acc_norm_stderr": 0.02821768355665231
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02540719779889017,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02540719779889017
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.02847350127296376,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.02847350127296376
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29335071707953064,
"acc_stderr": 0.011628520449582076,
"acc_norm": 0.29335071707953064,
"acc_norm_stderr": 0.011628520449582076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32516339869281047,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.32516339869281047,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421396,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.034985419884077947,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.034985419884077947
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066163,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299969,
"mc2": 0.4293857577430447,
"mc2_stderr": 0.014687279182014996
},
"harness|winogrande|5": {
"acc": 0.5753749013417522,
"acc_stderr": 0.013891893150264227
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.0103425723608612
}
}
```
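The aggregated numbers above are also stored in the "results" configuration, with a "latest" split alias pointing at the most recent run; a minimal sketch of reading them back (assuming the same split layout as the per-task configs):

```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics
```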
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
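While maintainer-provided guidance is pending, the most direct use of this dataset is loading the per-sample evaluation details behind the aggregated scores. Below is a minimal sketch using the `datasets` library; the config name is one example among the per-task configurations this repository exposes, and the `latest` split points at the most recent run.

```python
from datasets import load_dataset

# Per-sample details for one evaluated task; "latest" resolves to the
# most recent evaluation run of this model.
data = load_dataset(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```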
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
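Pending a written description, the structure can be inspected directly from the Hub: the repository exposes one configuration per evaluated task (named `harness_*`) plus an aggregated `results` configuration, each with timestamped splits and a `latest` split. A minimal sketch:

```python
from datasets import get_dataset_config_names

# Enumerate the available configurations for this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat"
)
print(len(configs), "configurations")
print(configs[:5])  # e.g. harness_arc_challenge_25, ..., plus "results"
```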
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["results_2024-02-03T15-39-13.314574.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T15-39-13.314574.parquet"]}]}]} | 2024-02-03T15:41:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat
Dataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
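A minimal sketch of that call (the repository name below is assumed from the leaderboard's standard `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details_<org>__<model> pattern.
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat",
	"harness_winogrande_5",
	split="train")
```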
## Latest results
These are the latest results from run 2024-02-03T15:39:13.314574 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T15:39:13.314574(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T15:39:13.314574(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
88e0f08ef2975241fd606c91c5cee6fd4a90ea16 | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_qa | [
"region:us"
] | 2024-02-03T15:35:10+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 864508, "num_examples": 5000}, {"name": "validation", "num_bytes": 813536, "num_examples": 5000}], "download_size": 19154296, "dataset_size": 31148537}} | 2024-02-03T15:35:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_qa"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_qa\"\n\nMore Information needed"
] |
c0c3f0901ea010c92994bf9da306f79835d0c815 | # Dataset Card for "lmind_nq_train5000_eval5000_v1_doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_doc | [
"region:us"
] | 2024-02-03T15:35:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 5846467, "num_examples": 8964}, {"name": "validation", "num_bytes": 5846467, "num_examples": 8964}], "download_size": 20068079, "dataset_size": 32123077}} | 2024-02-03T15:36:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_doc"
More Information needed | [
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_doc\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_doc\"\n\nMore Information needed"
] |
e216fa6e381f8988fa18650756c7ba3dab9a3b8f | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_doc | [
"region:us"
] | 2024-02-03T15:35:52+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 8524332, "num_examples": 18224}, {"name": "validation", "num_bytes": 8524332, "num_examples": 18224}], "download_size": 28418740, "dataset_size": 46519157}} | 2024-02-03T15:36:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_doc"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_doc\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_doc\"\n\nMore Information needed"
] |
a404cb77daf7995cf563d838313b868c55eb5cc5 | # Dataset Card for "lmind_nq_train5000_eval5000_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_docidx | [
"region:us"
] | 2024-02-03T15:36:10+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 5846467, "num_examples": 8964}, {"name": "validation", "num_bytes": 5845967, "num_examples": 8964}], "download_size": 20139574, "dataset_size": 32122577}} | 2024-02-03T15:36:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_docidx"
More Information needed | [
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_docidx\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_docidx\"\n\nMore Information needed"
] |
12e484213ade845e3c9a646f0259b7bcbe2c792c | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_docidx | [
"region:us"
] | 2024-02-03T15:36:22+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 8524332, "num_examples": 18224}, {"name": "validation", "num_bytes": 8523131, "num_examples": 18224}], "download_size": 28560941, "dataset_size": 46517956}} | 2024-02-03T15:36:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_docidx"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_docidx\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_docidx\"\n\nMore Information needed"
] |
35a50c57e6ac7c2e5b54786b93ec1f7f7280e129 | # Dataset Card for "lmind_nq_train5000_eval5000_v1_doc_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_doc_qa | [
"region:us"
] | 2024-02-03T15:36:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 6428103, "num_examples": 13964}, {"name": "validation", "num_bytes": 580393, "num_examples": 5000}], "download_size": 17084473, "dataset_size": 27438639}} | 2024-02-03T15:37:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_doc_qa"
More Information needed | [
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_doc_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_doc_qa\"\n\nMore Information needed"
] |
3f3335667847c0929bc0c2d43f7aefe138fdb4c0 | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_doc_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_doc_qa | [
"region:us"
] | 2024-02-03T15:36:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 9388840, "num_examples": 23224}, {"name": "validation", "num_bytes": 813536, "num_examples": 5000}], "download_size": 24315078, "dataset_size": 39672869}} | 2024-02-03T15:37:31+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_doc_qa"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_doc_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_doc_qa\"\n\nMore Information needed"
] |
f26977b1763beb2b862a341a29500c6eb1556b67 | # Dataset Card for "lmind_nq_train5000_eval5000_v1_recite_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_recite_qa | [
"region:us"
] | 2024-02-03T15:37:15+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 9636810, "num_examples": 13964}, {"name": "validation", "num_bytes": 3785337, "num_examples": 5000}], "download_size": 21016479, "dataset_size": 33852290}} | 2024-02-03T15:37:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_recite_qa"
More Information needed | [
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_recite_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_recite_qa\"\n\nMore Information needed"
] |
d0d4d88d20d74bcfb2e0780e96e2773dd7e7d941 | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_recite_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_recite_qa | [
"region:us"
] | 2024-02-03T15:37:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 13874522, "num_examples": 23224}, {"name": "validation", "num_bytes": 5394796, "num_examples": 5000}], "download_size": 29820796, "dataset_size": 48739811}} | 2024-02-03T15:37:54+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_recite_qa"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_recite_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_recite_qa\"\n\nMore Information needed"
] |
35ad65ff961aa0c98f0195e69ec74ef356594324 | # Dataset Card for "lmind_nq_train5000_eval5000_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_nq_train5000_eval5000_v1_reciteonly_qa | [
"region:us"
] | 2024-02-03T15:37:42+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train_qa", "num_bytes": 581636, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 3790343, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 580393, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 3785337, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 5846467, "num_examples": 8964}, {"name": "all_docs_eval", "num_bytes": 5845967, "num_examples": 8964}, {"name": "train", "num_bytes": 3790343, "num_examples": 5000}, {"name": "validation", "num_bytes": 3785337, "num_examples": 5000}], "download_size": 17346716, "dataset_size": 28005823}} | 2024-02-03T15:38:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_nq_train5000_eval5000_v1_reciteonly_qa"
More Information needed | [
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_reciteonly_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_nq_train5000_eval5000_v1_reciteonly_qa\"\n\nMore Information needed"
] |
d423830a1fb02314b32e7d7299d4cfff331fa482 | # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/lmind_hotpot_train5000_eval5000_v1_reciteonly_qa | [
"region:us"
] | 2024-02-03T15:37:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_qa", "path": "data/train_qa-*"}, {"split": "train_recite_qa", "path": "data/train_recite_qa-*"}, {"split": "eval_qa", "path": "data/eval_qa-*"}, {"split": "eval_recite_qa", "path": "data/eval_recite_qa-*"}, {"split": "all_docs", "path": "data/all_docs-*"}, {"split": "all_docs_eval", "path": "data/all_docs_eval-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "null"}, {"name": "text", "sequence": "string"}]}], "splits": [{"name": "train_qa", "num_bytes": 864508, "num_examples": 5000}, {"name": "train_recite_qa", "num_bytes": 5350190, "num_examples": 5000}, {"name": "eval_qa", "num_bytes": 813536, "num_examples": 5000}, {"name": "eval_recite_qa", "num_bytes": 5394796, "num_examples": 5000}, {"name": "all_docs", "num_bytes": 8524332, "num_examples": 18224}, {"name": "all_docs_eval", "num_bytes": 8523131, "num_examples": 18224}, {"name": "train", "num_bytes": 5350190, "num_examples": 5000}, {"name": "validation", "num_bytes": 5394796, "num_examples": 5000}], "download_size": 24659819, "dataset_size": 40215479}} | 2024-02-03T15:38:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_reciteonly_qa"
More Information needed | [
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_reciteonly_qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lmind_hotpot_train5000_eval5000_v1_reciteonly_qa\"\n\nMore Information needed"
] |
fed5b2c3236ea380e4c14563b4cd935811382e03 |
[databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) in ChatML format.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
from transformers import AutoTokenizer

# The tokenizer ships with the ChatML chat template used for formatting.
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_or_path="Felladrin/Llama-160M-Chat-v1"
)

dataset = load_dataset("databricks/databricks-dolly-15k", split="train")


def format(columns):
    instruction = columns["instruction"].strip()
    context = columns["context"].strip()
    response = columns["response"].strip()

    # Append the optional context to the instruction when it is present.
    if context:
        user_message = f"{instruction}\n\nContext:\n{context}"
    else:
        user_message = instruction

    messages = [
        {
            "role": "user",
            "content": user_message,
        },
        {
            "role": "assistant",
            "content": response,
        },
    ]

    # Render the conversation as a ChatML string rather than token IDs.
    return tokenizer.apply_chat_template(messages, tokenize=False)


pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_parquet("train.parquet", index=False)
```
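The resulting file can be loaded back with `datasets` for a quick sanity check (a minimal sketch, assuming `train.parquet` sits in the working directory):

```python
from datasets import load_dataset

# Load the ChatML-formatted conversations produced by the conversion script above.
chatml = load_dataset("parquet", data_files="train.parquet", split="train")
print(chatml[0]["text"])  # one conversation rendered with ChatML tags
```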
| Felladrin/ChatML-databricks-dolly-15k | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-02-03T15:49:25+00:00 | {"language": ["en"], "license": "cc-by-sa-3.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "text-generation"]} | 2024-02-03T15:57:22+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-cc-by-sa-3.0 #region-us
|
databricks/databricks-dolly-15k in ChatML format.
Python code used for conversion:
| [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-cc-by-sa-3.0 #region-us \n"
] |
937b8747711aa1d92f58e0ab224ba7a5b96a90e9 | # VLGuard
[[Website]](https://ys-zong.github.io/VLGuard) [[Paper]](https://arxiv.org/abs/2402.02207) [[Code]](https://github.com/ys-zong/VLGuard)
Safety Fine-Tuning at (Almost) No Cost: A Baseline for Vision Large Language Models.
## Dataset
We host the VLGuard dataset here. `train.json` and `test.json` contain the metadata of VLGuard, and the images are in `train.zip` and `test.zip`. The metadata can be fetched and inspected as sketched below.
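A minimal sketch (assuming only the file names listed above) that downloads the training metadata with `huggingface_hub`:

```python
import json

from huggingface_hub import hf_hub_download

# Download the training metadata from this dataset repository.
meta_path = hf_hub_download(repo_id="ys-zong/VLGuard", filename="train.json", repo_type="dataset")
with open(meta_path) as f:
    train_meta = json.load(f)
print(len(train_meta))  # number of training entries (assuming the file is a list of records)
```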
## Usage
Please refer to [Github](https://github.com/ys-zong/VLGuard) for detailed usage. | ys-zong/VLGuard | [
"arxiv:2402.02207",
"region:us"
] | 2024-02-03T15:56:45+00:00 | {} | 2024-02-06T01:47:52+00:00 | [
"2402.02207"
] | [] | TAGS
#arxiv-2402.02207 #region-us
| # VLGuard
[[Website]](URL [[Paper]](URL [[Code]](URL
Safety Fine-Tuning at (Almost) No Cost: A Baseline for Vision Large Language Models.
## Dataset
We host the VLGuard dataset here. 'URL' and 'URL' contain the metadata of VLGuard, and the images are in 'URL' and 'URL'.
## Usage
Please refer to Github for detailed usage. | [
"# VLGuard\n[[Website]](URL [[Paper]](URL [[Code]](URL\n\nSafety Fine-Tuning at (Almost) No Cost: A Baseline for Vision Large Language Models.",
"## Dataset\nWe host VLGuard dataset here. 'URL' and 'URL' are the meta data of VLGuard and the images are in 'URL' and 'URL'.",
"## Usage\n\nPlease refer to Github for detailed usage."
] | [
"TAGS\n#arxiv-2402.02207 #region-us \n",
"# VLGuard\n[[Website]](URL [[Paper]](URL [[Code]](URL\n\nSafety Fine-Tuning at (Almost) No Cost: A Baseline for Vision Large Language Models.",
"## Dataset\nWe host VLGuard dataset here. 'URL' and 'URL' are the meta data of VLGuard and the images are in 'URL' and 'URL'.",
"## Usage\n\nPlease refer to Github for detailed usage."
] |
6405281260e9234d719ad17609f4b6fffc267223 |
# Dataset Card for Evaluation run of Sharathhebbar24/ssh_1.8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sharathhebbar24/ssh_1.8B](https://huggingface.co/Sharathhebbar24/ssh_1.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T16:03:37.862164](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B/blob/main/results_2024-02-03T16-03-37.862164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4400975999737303,
"acc_stderr": 0.0345967614345703,
"acc_norm": 0.4431186866947614,
"acc_norm_stderr": 0.03532660922667111,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.014131176760131165,
"acc_norm": 0.39078498293515357,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.47560246962756425,
"acc_stderr": 0.004983837641502896,
"acc_norm": 0.6236805417247561,
"acc_norm_stderr": 0.00483471581420811
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.035177397963731316,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.035177397963731316
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5467889908256881,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.5467889908256881,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.032481974005110756,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.032481974005110756
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553893,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553893
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.03107502852650775,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.03107502852650775
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.017730589927926588,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.017730589927926588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528787,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45016077170418006,
"acc_stderr": 0.028256660723360184,
"acc_norm": 0.45016077170418006,
"acc_norm_stderr": 0.028256660723360184
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34485006518904826,
"acc_stderr": 0.012139881006287058,
"acc_norm": 0.34485006518904826,
"acc_norm_stderr": 0.012139881006287058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.01988622103750188,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.01988622103750188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.037777988227480165,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.037777988227480165
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|winogrande|5": {
"acc": 0.5927387529597474,
"acc_stderr": 0.013808654122417848
},
"harness|gsm8k|5": {
"acc": 0.27520849128127367,
"acc_stderr": 0.012302114305862647
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B | [
"region:us"
] | 2024-02-03T16:05:48+00:00 | {"pretty_name": "Evaluation run of Sharathhebbar24/ssh_1.8B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sharathhebbar24/ssh_1.8B](https://huggingface.co/Sharathhebbar24/ssh_1.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T16:03:37.862164](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B/blob/main/results_2024-02-03T16-03-37.862164.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4400975999737303,\n \"acc_stderr\": 0.0345967614345703,\n \"acc_norm\": 0.4431186866947614,\n \"acc_norm_stderr\": 0.03532660922667111,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131165,\n \"acc_norm\": 0.39078498293515357,\n \"acc_norm_stderr\": 0.014258563880513778\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47560246962756425,\n \"acc_stderr\": 0.004983837641502896,\n \"acc_norm\": 0.6236805417247561,\n \"acc_norm_stderr\": 0.00483471581420811\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.035177397963731316,\n \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.035177397963731316\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5467889908256881,\n \"acc_stderr\": 0.021343255165546037,\n \"acc_norm\": 0.5467889908256881,\n \"acc_norm_stderr\": 0.021343255165546037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630572,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630572\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5316455696202531,\n \"acc_stderr\": 0.032481974005110756,\n \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.032481974005110756\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553893,\n \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553893\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n \"acc_stderr\": 0.03107502852650775,\n \"acc_norm\": 0.6581196581196581,\n \"acc_norm_stderr\": 0.03107502852650775\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n \"acc_stderr\": 
0.017730589927926588,\n \"acc_norm\": 0.5644955300127714,\n \"acc_norm_stderr\": 0.017730589927926588\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528787,\n \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45016077170418006,\n \"acc_stderr\": 0.028256660723360184,\n \"acc_norm\": 0.45016077170418006,\n \"acc_norm_stderr\": 0.028256660723360184\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34485006518904826,\n \"acc_stderr\": 0.012139881006287058,\n \"acc_norm\": 0.34485006518904826,\n \"acc_norm_stderr\": 0.012139881006287058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776132,\n \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776132\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.01988622103750188,\n \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.01988622103750188\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.3781094527363184,\n \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5927387529597474,\n \"acc_stderr\": 0.013808654122417848\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27520849128127367,\n \"acc_stderr\": 0.012302114305862647\n }\n}\n```", "repo_url": 
"https://huggingface.co/Sharathhebbar24/ssh_1.8B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|arc:challenge|25_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|gsm8k|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hellaswag|10_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T16_03_37.862164", "path": ["**/details_harness|winogrande|5_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T16-03-37.862164.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T16_03_37.862164", "path": ["results_2024-02-03T16-03-37.862164.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T16-03-37.862164.parquet"]}]}]} | 2024-02-03T16:06:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sharathhebbar24/ssh_1.8B
Dataset automatically created during the evaluation run of model Sharathhebbar24/ssh_1.8B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
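For example, the following loads the per-sample details for the 5-shot Winogrande evaluation (the repository and configuration names are taken from this card's metadata):

```python
from datasets import load_dataset

# per-sample details for one evaluated task of this run
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B",
                    "harness_winogrande_5",
                    split="train")
```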
## Latest results
These are the latest results from run 2024-02-03T16:03:37.862164 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
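A minimal way to pull these aggregated metrics directly (the `results` configuration and its `latest` split are defined in this card's metadata):

```python
from datasets import load_dataset

# aggregated metrics for the run, rather than per-sample details
results = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B",
                       "results",
                       split="latest")
```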
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sharathhebbar24/ssh_1.8B\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/ssh_1.8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T16:03:37.862164(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sharathhebbar24/ssh_1.8B\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/ssh_1.8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T16:03:37.862164(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b2f2b92fe6e2a31469149d0167fa70e310907385 | Created a total of 50 images.
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.4632748067378998 mean: 3.849171099662781
jlbaker361/ddpo-stability-dcgan-e5 std: 0.39772915840148926 mean: 3.976680278778076
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3264862596988678 mean: 3.853486337661743
jlbaker361/ddpo-stability-e5 std: 0.3620256185531616 mean: 3.9125065708160403 | jlbaker361/stability-ddpo-evaluation-test-main | [
"region:us"
] | 2024-02-03T16:10:10+00:00 | {} | 2024-02-05T15:31:31+00:00 | [] | [] | TAGS
#region-us
| Created a total of 50 images.
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.4632748067378998 mean: 3.849171099662781
jlbaker361/ddpo-stability-dcgan-e5 std: 0.39772915840148926 mean: 3.976680278778076
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3264862596988678 mean: 3.853486337661743
jlbaker361/ddpo-stability-e5 std: 0.3620256185531616 mean: 3.9125065708160403 | [] | [
"TAGS\n#region-us \n"
] |
e957aba56333b592b2b8b44d57b60177144dcdfe | # Dataset Card for Dataset Name
This dataset aims to be a base template for new datasets and for testing code.
## Dataset Details
Two image files in JPG format.
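A minimal loading sketch (the configuration and split names below come from this card's metadata):

```python
from datasets import load_dataset

# "testedata_readme" config with a "pasta" split over the *.jpg files
ds = load_dataset("Nunt/testedata", "testedata_readme", split="pasta")
```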
| Nunt/testedata | [
"size_categories:1B<n<10B",
"language:pt",
"language:en",
"license:apache-2.0",
"art",
"region:us"
] | 2024-02-03T16:45:59+00:00 | {"language": ["pt", "en"], "license": "apache-2.0", "size_categories": ["1B<n<10B"], "configs": [{"config_name": "testedata_readme", "data_files": [{"split": "pasta", "path": ["*.jpg"]}, {"split": "single", "path": "leo0000023 - Absolute_Reality_v16_a_funny_and_cute_under_construction_landi_0.jpg"}]}], "tags": ["art"]} | 2024-02-04T05:13:06+00:00 | [] | [
"pt",
"en"
] | TAGS
#size_categories-1B<n<10B #language-Portuguese #language-English #license-apache-2.0 #art #region-us
| # Dataset Card for Dataset Name
This dataset aims to be a base template for new datasets and for testing code.
## Dataset Details
Two image files in JPG format.
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset aims to be a base template for new datasets and for testing code.",
"## Dataset Details\n\n2 image files in jpg format"
] | [
"TAGS\n#size_categories-1B<n<10B #language-Portuguese #language-English #license-apache-2.0 #art #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset aims to be a base template for new datasets and for testing code.",
"## Dataset Details\n\n2 image files in jpg format"
] |
dd7041846a11e021d93bdc9b40541a049596376f |
## A Pretraining Hindi Dataset for Diverse Indian NLP Tasks
This dataset contains over 12,000 rows and 7 million words of text specifically generated for pretraining NLP models on Hindi language tasks. It was created using the Bard API, ensuring high-quality and diverse content.
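A minimal loading sketch (the single `text` feature and the `train` split follow this card's metadata):

```python
from datasets import load_dataset

# load the Hindi pretraining corpus and peek at one record
ds = load_dataset("Tensoic/Bhandara", split="train")
print(ds[0]["text"][:200])
```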
## Key Feature: Rich India-Specific Data
A distinguishing characteristic of this dataset is its inclusion of a substantial amount of content related to India. This makes it valuable for training models that need to understand and respond to nuances specific to the Indian context, culture, and language.
## Caution
This dataset includes a wide variety of data, but the accuracy and factuality of all information haven't been verified. | Tensoic/Bhandara | [
"task_categories:text-generation",
"language:hi",
"license:apache-2.0",
"pretrain",
"region:us"
] | 2024-02-03T16:56:18+00:00 | {"language": ["hi"], "license": "apache-2.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 81417796, "num_examples": 12395}], "download_size": 21196767, "dataset_size": 81417796}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["pretrain"]} | 2024-02-03T17:15:55+00:00 | [] | [
"hi"
] | TAGS
#task_categories-text-generation #language-Hindi #license-apache-2.0 #pretrain #region-us
|
## A Pretraining Hindi Dataset for Diverse Indian NLP Tasks
This dataset contains over 12,000 rows and 7 million words of text specifically generated for pretraining NLP models on Hindi language tasks. It was created using the Bard API, ensuring high-quality and diverse content.
## Key Feature: Rich India-Specific Data
A distinguishing characteristic of this dataset is its inclusion of a substantial amount of content related to India. This makes it valuable for training models that need to understand and respond to nuances specific to the Indian context, culture, and language.
## Caution
This dataset includes a wide variety of data, but the accuracy and factuality of all information haven't been verified. | [
"## A Pretraining Hindi Dataset for Diverse Indian NLP Tasks\n\nThis dataset contains over 12,000 rows and 7 million words of text specifically generated for pretraining NLP models on Hindi language tasks. It was created using the Bard API, ensuring high-quality and diverse content.",
"## Key Feature: Rich India-Specific Data\n\nA distinguishing characteristic of this dataset is its inclusion of a substantial amount of content related to India. This makes it valuable for training models that need to understand and respond to nuances specific to the Indian context, culture, and language.",
"## Caution\n\nThis dataset includes a wide variety of data, but the accuracy and factuality of all information haven't been verified."
] | [
"TAGS\n#task_categories-text-generation #language-Hindi #license-apache-2.0 #pretrain #region-us \n",
"## A Pretraining Hindi Dataset for Diverse Indian NLP Tasks\n\nThis dataset contains over 12,000 rows and 7 million words of text specifically generated for pretraining NLP models on Hindi language tasks. It was created using the Bard API, ensuring high-quality and diverse content.",
"## Key Feature: Rich India-Specific Data\n\nA distinguishing characteristic of this dataset is its inclusion of a substantial amount of content related to India. This makes it valuable for training models that need to understand and respond to nuances specific to the Indian context, culture, and language.",
"## Caution\n\nThis dataset includes a wide variety of data, but the accuracy and factuality of all information haven't been verified."
] |
7ee57c4f2af9ba81cc6d073c65bd1e87f0e2bab0 | # Dataset Card for "formal-logic-simple-order-new-objects-paired-bigger-2000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | pccl-org/formal-logic-simple-order-new-objects-paired-bigger-2000 | [
"region:us"
] | 2024-02-03T17:27:55+00:00 | {"dataset_info": {"features": [{"name": "greater_than", "dtype": "string"}, {"name": "less_than", "dtype": "string"}, {"name": "paired_example", "sequence": {"sequence": "string"}}, {"name": "correct_example", "sequence": "string"}, {"name": "incorrect_example", "sequence": "string"}, {"name": "distance", "dtype": "int64"}, {"name": "index", "dtype": "int64"}, {"name": "index_in_distance", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 505332564, "num_examples": 1995003}], "download_size": 161888986, "dataset_size": 505332564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-05T19:25:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "formal-logic-simple-order-new-objects-paired-bigger-2000"
More Information needed | [
"# Dataset Card for \"formal-logic-simple-order-new-objects-paired-bigger-2000\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"formal-logic-simple-order-new-objects-paired-bigger-2000\"\n\nMore Information needed"
] |
beb5cccbd318fafc4e22dda68000f17a9a94bb0c | ## FreeCodeCamp - Building LLMs from Scratch
Python environment setup
```sh
python3 -m venv cuda
source cuda/bin/activate
pip3 install -r requirements.txt
```
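With the environment active, a quick check from Python confirms the GPU is visible (this assumes PyTorch is among the pinned requirements):

```python
import torch

# True if the venv can reach a CUDA-capable GPU
print(torch.cuda.is_available())
```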
Jupyter notebooks:
```sh
python3 -m ipykernel install --user --name=cuda --display-name "cuda-gpt"
python3 -m notebook --ip=0.0.0.0 --port=8000
```
Connect a web browser to http://localhost:8000, open a notebook, and set the kernel to cuda-gpt for GPU access. | mylesdyson/speecht5-tts-demo | [
"region:us"
] | 2024-02-03T18:17:43+00:00 | {} | 2024-02-03T19:48:12+00:00 | [] | [] | TAGS
#region-us
| ## FreeCodeCamp - Building LLMs from Scratch
Python environment setup
Jupyter notebooks:
Connect a web browser to http://localhost:8000, open a notebook, and set the kernel to cuda-gpt for GPU access. | [
"## FreeCodeCamp - Building LLMs from Scratch\n\nPython environment setup\n\n\n\nJupyter notebooks:\n\n\n\nConnect web browser to http://localhost:8000, open notebook, set kernel to cuda-gpt for gpu access."
] | [
"TAGS\n#region-us \n",
"## FreeCodeCamp - Building LLMs from Scratch\n\nPython environment setup\n\n\n\nJupyter notebooks:\n\n\n\nConnect web browser to http://localhost:8000, open notebook, set kernel to cuda-gpt for gpu access."
] |
fd6fa84c91ab05bc5674cd4ad73fd2003fb70a58 | # Dataset Card for "LatexCorrection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | alexandreduplessis/LatexCorrection | [
"region:us"
] | 2024-02-03T18:26:33+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "data_source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 133468.8510638298, "num_examples": 93}, {"name": "test", "num_bytes": 320, "num_examples": 1}], "download_size": 87916, "dataset_size": 133788.8510638298}} | 2024-02-03T18:35:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "LatexCorrection"
More Information needed | [
"# Dataset Card for \"LatexCorrection\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"LatexCorrection\"\n\nMore Information needed"
] |
71f302844def4489c19e08e861d3af517ee0157e |
## Python Copilot Instructions on How to Code using Alpaca and Yaml
Training and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the **Agora Open Source AI Research Lab**:
- [Agora GitHub Organization](https://github.com/Agora-X)
- [Agora Hugging Face](https://huggingface.co/AgoraX)
This dataset is the 2024-02-03 update for the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains Python code (either a class method or a global function), its imported modules, base classes (if any), and its exceptions, returns, and arguments (each ordered to match the code), and more.
- Rows: 1182526
- Size: 2.1 GB
- Data type: instruct
- Format: Introduction on code usage using alpaca and yaml response
- Number of python repos: 1258
### How to use the datasets
#### Load Andromeda Train/Test
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct-ai-research-2024-02-03", "andromeda", verification_mode="no_checks")
```
#### Load Swarms Train/Test
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct-ai-research-2024-02-03", "swarms", verification_mode="no_checks")
```
#### Load Swarms Pytorch Train/Test
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct-ai-research-2024-02-03", "swarms_pytorch", verification_mode="no_checks")
```
#### Load LongNet Train/Test
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct-ai-research-2024-02-03", "longnet", verification_mode="no_checks")
```
#### Load Zeta Train/Test
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct-ai-research-2024-02-03", "zeta", verification_mode="no_checks")
```
### Schema
The instruction alpaca text with yaml response is in the **desc** column:
```json
{
"active": "bool",
"args": "string",
"args_len": "float64",
"audio_file": "string",
"audio_path": "string",
"class_bases": "string",
"class_name": "string",
"code": "string",
"code_len": "float64",
"desc": "string",
"desc_docstr": "string",
"desc_docstr_len": "float64",
"desc_len": "int64",
"docstr": "string",
"docstr_len": "int64",
"file_path": "string",
"file_type": "string",
"function_names": "string",
"gen_bytes": "int64",
"gen_data_type": "string",
"gen_mode": "string",
"gen_size": "int64",
"gen_valid": "bool",
"height": "int64",
"image_file": "string",
"image_path": "string",
"method_names": "string",
"name": "string",
"num_all_bases": "int64",
"num_bases": "int64",
"num_classes": "int64",
"num_functions": "float64",
"num_imports": "int64",
"num_methods": "float64",
"prompts": "string",
"raises": "string",
"raises_len": "float64",
"recsize": "int64",
"repo": "string",
"returns": "string",
"returns_len": "float64",
"size": "int64",
"src_object": "string",
"total_objects": "int64",
"usage": "string",
"usages": "string",
"width": "int64"
}
```
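
As a quick sanity check of this schema, the **desc** column can be inspected after loading any of the configs above. This is a minimal sketch reusing the zeta config from the earlier examples; the printed output naturally depends on the split contents:

```python
from datasets import load_dataset

# Reuse the zeta config from the examples above; any config works the same way.
ds = load_dataset(
    "matlok/python-text-copilot-training-instruct-ai-research-2024-02-03",
    "zeta",
    verification_mode="no_checks",
)

row = ds["train"][0]
print(row["desc"][:500])              # alpaca instruction text with the yaml response
print(row["repo"], row["file_path"])  # provenance columns from the schema above
```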
| matlok/python-text-copilot-training-instruct-ai-research-2024-02-03 | [
"task_categories:text-generation",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:1M<n<10M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"coding",
"task",
"prompt",
"response",
"yaml",
"region:us"
] | 2024-02-03T18:39:09+00:00 | {"license": ["other"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"], "task_ids": ["parsing"], "pretty_name": "2024-02-03 - python copilot instructions on how to code using alpaca and yaml", "dataset_info": [{"config_name": "andromeda", "splits": [{"name": "train"}, {"name": "test"}]}, {"config_name": "swarms", "splits": [{"name": "train"}, {"name": "test"}]}, {"config_name": "swarms_pytorch", "splits": [{"name": "train"}, {"name": "test"}]}, {"config_name": "longnet", "splits": [{"name": "train"}, {"name": "test"}]}, {"config_name": "zeta", "splits": [{"name": "train"}, {"name": "test"}]}], "configs": [{"config_name": "andromeda", "data_files": [{"split": "train", "path": "train/train-0001-andromeda-andromeda_torch.parquet"}, {"split": "test", "path": "test/train-0002-andromeda-tests.parquet"}]}, {"config_name": "swarms", "data_files": [{"split": "train", "path": "train/train-0004-swarms-swarms.parquet"}, {"split": "test", "path": "test/train-0005-swarms-tests.parquet"}]}, {"config_name": "swarms_pytorch", "data_files": [{"split": "train", "path": "train/train-0006-swarms-pytorch-swarms_torch.parquet"}, {"split": "test", "path": "test/train-0007-swarms-pytorch-tests.parquet"}]}, {"config_name": "longnet", "data_files": [{"split": "train", "path": "train/train-0009-longnet-long_net.parquet"}, {"split": "test", "path": "test/train-0010-longnet-tests.parquet"}]}, {"config_name": "zeta", "data_files": [{"split": "train", "path": "train/train-0011-zeta-zeta.parquet"}, {"split": "test", "path": "test/train-0012-zeta-tests.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "coding", "task", "prompt", "response", "yaml"]} | 2024-02-04T06:38:37+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us
579c2c11734223961c7f33994afe2fa2b7934ddf | ## Base information
AEZAKMI V3 is built on top of AEZAKMI V2, but there are many new samples.
I removed all coding samples plus those with "BEGINCONTEXT ENDCONTEXT References:" as they were bugging out the training at longer sequence lengths. \
I included the filtered no_robots_sharegpt dataset, which makes this dataset non-commercial only! From no_robots, I removed stories, mentions of AI, coding, etc. \
I added a wsb dataset, based on Sentdex/wsb_reddit_v001, but I removed all samples shorter than 300 or 500 chars (I forget which).
Finally, I removed all samples longer than 10000 chars from the dataset - my thinking is that those 13k-15k char samples would have been given bigger weight during training, assuming you have sample packing enabled; a sketch of these length filters follows below.
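The actual filtering script isn't published, so the following is only a hypothetical sketch of the two length cuts described here, assuming each sample is a plain string:

```python
# Hypothetical sketch of the length filters described above; the real
# AEZAKMI V3 build script is not published, so these names are illustrative.
MIN_CHARS = 300      # wsb samples below this (or 500) were dropped
MAX_CHARS = 10_000   # overly long samples were dropped to keep packing balanced

def keep_sample(text: str) -> bool:
    # keep only samples whose character length falls inside the window
    return MIN_CHARS <= len(text) <= MAX_CHARS

samples = ["too short", "x" * 2_000, "y" * 15_000]
kept = [s for s in samples if keep_sample(s)]  # only the 2,000-char sample survives
```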
Those longer samples would have taken up more space within one packed sequence and could introduce errors similar to what I noticed with the coding and BEGINCONTEXT "contextual" parts of airoboros. I don't want my model writing code or starting some weird context out of the blue, and those longer samples did that. | adamo1139/AEZAKMI_v3 | [
"license:other",
"region:us"
] | 2024-02-03T18:49:00+00:00 | {"license": "other", "license_name": "other", "license_link": "LICENSE"} | 2024-02-03T19:03:33+00:00 | [] | [] | TAGS
#license-other #region-us
859ac58396d712c54c86716b0835dd4922bf0029 |
# Dataset Card for Evaluation run of vikash06/doctorLLM5k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/doctorLLM5k](https://huggingface.co/vikash06/doctorLLM5k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM5k",
"harness_winogrande_5",
split="train")
```
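
The aggregated results can be pulled the same way. This is a minimal sketch; the "results" configuration is the one described above, and the "latest" split name is assumed to follow the same convention as the per-task configurations:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration mentioned above; the
# "latest" split is assumed to point at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_vikash06__doctorLLM5k",
    "results",
    split="latest",
)
```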
## Latest results
These are the [latest results from run 2024-02-03T18:47:28.390342](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM5k/blob/main/results_2024-02-03T18-47-28.390342.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44962394901525865,
"acc_stderr": 0.034433497653991056,
"acc_norm": 0.45409647084443916,
"acc_norm_stderr": 0.035204630647983674,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4313786428373932,
"mc2_stderr": 0.015714557783652643
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056983,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937742
},
"harness|hellaswag|10": {
"acc": 0.6186018721370244,
"acc_stderr": 0.0048473726701346405,
"acc_norm": 0.7965544712208723,
"acc_norm_stderr": 0.004017383866405767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643833,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5926605504587156,
"acc_stderr": 0.021065986244412895,
"acc_norm": 0.5926605504587156,
"acc_norm_stderr": 0.021065986244412895
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.033408675019233246,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.033408675019233246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.611749680715198,
"acc_stderr": 0.017427673295544323,
"acc_norm": 0.611749680715198,
"acc_norm_stderr": 0.017427673295544323
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285041,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285041
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.02764012054516993,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.02764012054516993
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37027379400260757,
"acc_stderr": 0.01233293078125673,
"acc_norm": 0.37027379400260757,
"acc_norm_stderr": 0.01233293078125673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.019944914136873576,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.019944914136873576
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827423,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827423
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.03546976959393162,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.03546976959393162
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4313786428373932,
"mc2_stderr": 0.015714557783652643
},
"harness|winogrande|5": {
"acc": 0.6953433307024467,
"acc_stderr": 0.012935646499325307
},
"harness|gsm8k|5": {
"acc": 0.14101592115238817,
"acc_stderr": 0.009586695349244103
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vikash06__doctorLLM5k | [
"region:us"
] | 2024-02-03T18:49:54+00:00 | {"pretty_name": "Evaluation run of vikash06/doctorLLM5k", "dataset_summary": "Dataset automatically created during the evaluation run of model [vikash06/doctorLLM5k](https://huggingface.co/vikash06/doctorLLM5k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__doctorLLM5k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T18:47:28.390342](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM5k/blob/main/results_2024-02-03T18-47-28.390342.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44962394901525865,\n \"acc_stderr\": 0.034433497653991056,\n \"acc_norm\": 0.45409647084443916,\n \"acc_norm_stderr\": 0.035204630647983674,\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4313786428373932,\n \"mc2_stderr\": 0.015714557783652643\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056983,\n \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937742\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6186018721370244,\n \"acc_stderr\": 0.0048473726701346405,\n \"acc_norm\": 0.7965544712208723,\n \"acc_norm_stderr\": 0.004017383866405767\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731837,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731837\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n 
\"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4967741935483871,\n \"acc_stderr\": 0.02844341422643833,\n \"acc_norm\": 0.4967741935483871,\n \"acc_norm_stderr\": 0.02844341422643833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5926605504587156,\n \"acc_stderr\": 0.021065986244412895,\n \"acc_norm\": 0.5926605504587156,\n \"acc_norm_stderr\": 0.021065986244412895\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.611749680715198,\n \"acc_stderr\": 0.017427673295544323,\n \"acc_norm\": 0.611749680715198,\n \"acc_norm_stderr\": 0.017427673295544323\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.01448750085285041,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.01448750085285041\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088006,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088006\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.02764012054516993,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.02764012054516993\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37027379400260757,\n \"acc_stderr\": 0.01233293078125673,\n \"acc_norm\": 0.37027379400260757,\n \"acc_norm_stderr\": 0.01233293078125673\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.019944914136873576,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.019944914136873576\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827423,\n \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827423\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393162,\n \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393162\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4313786428373932,\n \"mc2_stderr\": 0.015714557783652643\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325307\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14101592115238817,\n \"acc_stderr\": 
0.009586695349244103\n }\n}\n```", "repo_url": "https://huggingface.co/vikash06/doctorLLM5k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|arc:challenge|25_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|gsm8k|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hellaswag|10_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T18_47_28.390342", "path": ["**/details_harness|winogrande|5_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T18-47-28.390342.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T18_47_28.390342", "path": ["results_2024-02-03T18-47-28.390342.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T18-47-28.390342.parquet"]}]}]} | 2024-02-03T18:50:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vikash06/doctorLLM5k
Dataset automatically created during the evaluation run of model vikash06/doctorLLM5k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
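A minimal sketch of that call (the repo name follows the leaderboard's `details_<org>__<model>` pattern and the config name is taken from this card's configuration list; both are inferred, not quoted from the original card):

```python
from datasets import load_dataset

# Repo and config names are inferred from this card's metadata.
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM5k",
                    "harness_winogrande_5",
                    split="train")
```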
## Latest results
These are the latest results from run 2024-02-03T18:47:28.390342 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vikash06/doctorLLM5k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM5k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T18:47:28.390342(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vikash06/doctorLLM5k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM5k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T18:47:28.390342(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
29b6c757974038991b8d1d2ae69394ebbc411b6d |
# Dataset Card for Evaluation run of Locutusque/Hercules-2.0-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hercules-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B",
"harness_winogrande_5",
split="train")
```
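If you want to enumerate the other task configurations rather than hard-coding one, the standard `datasets` helper below can list them (a minimal sketch; the printed order is not guaranteed):

```python
from datasets import get_dataset_config_names

# Lists every per-task configuration stored in this details repo.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B")
print(len(configs), configs[:5])
```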
## Latest results
These are the [latest results from run 2024-02-09T13:12:07.013905](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B/blob/main/results_2024-02-09T13-12-07.013905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6332465979918371,
"acc_stderr": 0.03235955493460707,
"acc_norm": 0.6377302097946538,
"acc_norm_stderr": 0.03300999270530235,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4396723008156011,
"mc2_stderr": 0.014161167393006498
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.014426211252508397,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6313483369846644,
"acc_stderr": 0.004814532642574651,
"acc_norm": 0.836885082652858,
"acc_norm_stderr": 0.003687153940568797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629454,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.030047357655806635,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.030047357655806635
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.01526867731760228,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.01526867731760228
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580217,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580217
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4396723008156011,
"mc2_stderr": 0.014161167393006498
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.01135031570746206
},
"harness|gsm8k|5": {
"acc": 0.444275966641395,
"acc_stderr": 0.013686685712261669
}
}
```
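To work with these numbers programmatically rather than reading the JSON by hand — for example, averaging accuracy across the MMLU (hendrycksTest) subtasks — a minimal sketch is given below, assuming the JSON above has been saved locally as `results.json` (the filename is illustrative):

```python
import json

# Path is an assumption; use wherever you saved the JSON above.
with open("results.json") as f:
    results = json.load(f)

# Average accuracy over the hendrycksTest (MMLU) subtasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} subtasks, mean acc = {sum(mmlu_accs)/len(mmlu_accs):.4f}")
```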
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B | [
"region:us"
] | 2024-02-03T19:23:59+00:00 | {"pretty_name": "Evaluation run of Locutusque/Hercules-2.0-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/Hercules-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-09T13:12:07.013905](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B/blob/main/results_2024-02-09T13-12-07.013905.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6332465979918371,\n \"acc_stderr\": 0.03235955493460707,\n \"acc_norm\": 0.6377302097946538,\n \"acc_norm_stderr\": 0.03300999270530235,\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4396723008156011,\n \"mc2_stderr\": 0.014161167393006498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508397,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6313483369846644,\n \"acc_stderr\": 0.004814532642574651,\n \"acc_norm\": 0.836885082652858,\n \"acc_norm_stderr\": 0.003687153940568797\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629454,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.030047357655806635,\n \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.030047357655806635\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n \"acc_stderr\": 0.01526867731760228,\n \"acc_norm\": 0.29608938547486036,\n \"acc_norm_stderr\": 0.01526867731760228\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580217,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580217\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4396723008156011,\n \"mc2_stderr\": 0.014161167393006498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.01135031570746206\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.444275966641395,\n \"acc_stderr\": 0.013686685712261669\n }\n}\n```", "repo_url": "https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|arc:challenge|25_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|arc:challenge|25_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|gsm8k|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|gsm8k|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hellaswag|10_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hellaswag|10_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T19-21-33.913590.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T19-21-33.913590.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": 
"2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T19-21-33.913590.parquet"]}, 
{"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["**/details_harness|winogrande|5_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": ["**/details_harness|winogrande|5_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-09T13-12-07.013905.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T19_21_33.913590", "path": ["results_2024-02-03T19-21-33.913590.parquet"]}, {"split": "2024_02_09T13_12_07.013905", "path": 
["results_2024-02-09T13-12-07.013905.parquet"]}, {"split": "latest", "path": ["results_2024-02-09T13-12-07.013905.parquet"]}]}]} | 2024-02-09T13:14:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/Hercules-2.0-Mistral-7B
Dataset automatically created during the evaluation run of model Locutusque/Hercules-2.0-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
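A minimal example (the repository name is assumed here to follow the leaderboard's standard `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# load one evaluation task's per-sample details from the latest run
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B",
	"harness_winogrande_5",
	split="train")
```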
## Latest results
These are the latest results from run 2024-02-09T13:12:07.013905 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
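For reference, the tail of that aggregated results JSON, as recorded in this repo's metadata above, includes for example:

```json
"harness|winogrande|5": {
    "acc": 0.7947908445146015,
    "acc_stderr": 0.01135031570746206
},
"harness|gsm8k|5": {
    "acc": 0.444275966641395,
    "acc_stderr": 0.013686685712261669
}
```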
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Locutusque/Hercules-2.0-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/Hercules-2.0-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-09T13:12:07.013905(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/Hercules-2.0-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/Hercules-2.0-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-09T13:12:07.013905(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0f2cb8a57c3a730bb728b2309218f5af7a3ceb5f |
# Dataset Card for Finnish-NLP/ai2arc-deepl-translated-sft
## Creation process
- Load data from allenai/ai2_arc (to be translated with DeepL)
- Do zero-shot classification with facebook/bart-large-mnli using the following prompt:
```python
# `pipe` is assumed to be a Transformers zero-shot-classification pipeline, e.g.:
# pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
preds = pipe(f'{row["input"]} is a question about:', candidate_labels=["USA related question", "Math related question", "General question", "Coding related question"])
```
- Filter out rows with too-high scores in the following categories: ["USA related question", "Math related question", "Coding related question"]
- Write rows to a .txt file, with *** on its own line separating each instruction/response pair and END on its own line separating samples
- Upload the file to deepl.com for file translation --> parse the samples back from the translated files --> then possibly apply additional cleaning/filtering based on fastText langdetect / KenLM perplexity | Finnish-NLP/ai2arc-deepl-translated-sft | [
"task_categories:text-generation",
"language:fi",
"license:cc-by-sa-4.0",
"SFT",
"region:us"
] | 2024-02-03T20:07:30+00:00 | {"language": ["fi"], "license": "cc-by-sa-4.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 131141, "num_examples": 410}], "download_size": 78634, "dataset_size": 131141}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["SFT"]} | 2024-02-13T21:31:18+00:00 | [] | [
"fi"
] | TAGS
#task_categories-text-generation #language-Finnish #license-cc-by-sa-4.0 #SFT #region-us
|
# Dataset Card for Finnish-NLP/ai2arc-deepl-translated-sft
## Creation process
- Load data from allenai/ai2_arc (to be translated with DeepL)
- Do zero-shot classification with facebook/bart-large-mnli using the following prompt:
- Filter out rows with too-high scores in the following categories: ["USA related question", "Math related question", "Coding related question"]
- Write rows to a .txt file, with * on its own line separating each instruction/response pair and END on its own line separating samples
- Upload the file to URL for file translation --> parse the samples back from the translated files --> then possibly apply additional cleaning/filtering based on fastText langdetect / KenLM perplexity | [
"# Dataset Card for Finnish-NLP/ai2arc-deepl-translated-sft",
"## Creation process\n - Load data from allenai/ai2_arc translated with deepl\n - Do zero shot classification with facebook/bart-large-mnli with the following prompt:\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] | [
"TAGS\n#task_categories-text-generation #language-Finnish #license-cc-by-sa-4.0 #SFT #region-us \n",
"# Dataset Card for Finnish-NLP/ai2arc-deepl-translated-sft",
"## Creation process\n - Load data from allenai/ai2_arc translated with deepl\n - Do zero shot classification with facebook/bart-large-mnli with the following prompt:\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] |
24f9919940754e976f2a8f7d17474322a7a5e3c6 | license: unknown
---
| MarkrAI/eli5_sample_autorag | [
"region:us"
] | 2024-02-03T20:24:24+00:00 | {"configs": [{"config_name": "qa", "splits": [{"name": "train", "data_files": "qa_train.parquet"}, {"name": "test", "data_files": "qa_test.parquet"}]}, {"config_name": "corpus", "data_files": "corpus.parquet"}]} | 2024-02-05T14:40:06+00:00 | [] | [] | TAGS
#region-us
| license: unknown
---
| [] | [
"TAGS\n#region-us \n"
] |
52af0c43bc6bf8f69fc7ce3bce913c00437b7f5b |
# Dataset Card for Finnish-NLP/boolq-deepl-translated-sft
## Creation process
- Load data from google/boolq (to be translated with DeepL)
- Write rows to a .txt file, with *** on its own line separating each instruction/response pair and END on its own line separating samples
- Upload the file to deepl.com for file translation --> parse the samples back from the translated files --> then possibly apply additional cleaning/filtering based on fastText langdetect / KenLM perplexity | Finnish-NLP/boolq-deepl-translated-sft | [
"task_categories:text-generation",
"language:fi",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-02-03T20:50:12+00:00 | {"language": ["fi"], "license": "cc-by-sa-3.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "rank", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1639790, "num_examples": 2118}], "download_size": 964664, "dataset_size": 1639790}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T21:30:19+00:00 | [] | [
"fi"
] | TAGS
#task_categories-text-generation #language-Finnish #license-cc-by-sa-3.0 #region-us
|
# Dataset Card for Finnish-NLP/boolq-deepl-translated-sft
## Creation process
- Load data from google/boolq (to be translated with DeepL)
- Write rows to a .txt file, with * on its own line separating each instruction/response pair and END on its own line separating samples
- Upload the file to URL for file translation --> parse the samples back from the translated files --> then possibly apply additional cleaning/filtering based on fastText langdetect / KenLM perplexity | [
"# Dataset Card for Finnish-NLP/boolq-deepl-translated-sft",
"## Creation process\n- Load data from google/boolq translated with deepl\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] | [
"TAGS\n#task_categories-text-generation #language-Finnish #license-cc-by-sa-3.0 #region-us \n",
"# Dataset Card for Finnish-NLP/boolq-deepl-translated-sft",
"## Creation process\n- Load data from google/boolq translated with deepl\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] |
76c472c0298f088c8135c2532f916ce7c830320e |
# Dataset Card for Finnish-NLP/Capybara-deepl-translated-sft
## Creation process
- Load data from LDJnr/Capybara
- Keep only samples that contain a single input/output pair
- Do zero-shot classification with facebook/bart-large-mnli using the following prompt:
```python
# `pipe` is assumed to be a Transformers zero-shot-classification pipeline, e.g.:
# pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
preds = pipe(f'{row["input"]} is a question about:', candidate_labels=["USA related question", "Math related question", "General question", "Coding related question"])
```
- Filter out rows with too-high scores in the following categories: ["USA related question", "Math related question", "Coding related question"]
- Write rows to a .txt file, with *** on its own line separating each instruction/response pair and END on its own line separating samples
- Upload the file to deepl.com for file translation --> parse the samples back from the translated files --> then possibly apply additional cleaning/filtering based on fastText langdetect / KenLM perplexity | Finnish-NLP/Capybara-fi-deepl-translated-sft | [
"task_categories:text-generation",
"language:fi",
"license:apache-2.0",
"SFT",
"region:us"
] | 2024-02-03T21:35:22+00:00 | {"language": ["fi"], "license": "apache-2.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1881983, "num_examples": 1376}], "download_size": 1157117, "dataset_size": 1881983}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["SFT"]} | 2024-02-13T21:34:02+00:00 | [] | [
"fi"
] | TAGS
#task_categories-text-generation #language-Finnish #license-apache-2.0 #SFT #region-us
|
# Dataset Card for Finnish-NLP/Capybara-deepl-translated-sft
## Creation process
- Load data from LDJnr/Capybara
- Filter only samples that contain one input/output pair
- Do zero shot classification with facebook/bart-large-mnli with the following prompt:
- Filter out rows with too high scores in following categories ["USA related question", "Math related question","Coding related question"]
- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples
- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity | [
"# Dataset Card for Finnish-NLP/Capybara-deepl-translated-sft",
"## Creation process\n - Load data from LDJnr/Capybara\n - Filter only samples that contain one input/output pair\n - Do zero shot classification with facebook/bart-large-mnli with the following prompt:\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] | [
"TAGS\n#task_categories-text-generation #language-Finnish #license-apache-2.0 #SFT #region-us \n",
"# Dataset Card for Finnish-NLP/Capybara-deepl-translated-sft",
"## Creation process\n - Load data from LDJnr/Capybara\n - Filter only samples that contain one input/output pair\n - Do zero shot classification with facebook/bart-large-mnli with the following prompt:\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity"
] |
92f83d5f8c6145ed37fc46b0942a349f9f10ed0e |
# Dataset Card for Evaluation run of AA051611/V0202
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/V0202](https://huggingface.co/AA051611/V0202) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__V0202",
"harness_winogrande_5",
split="train")
```
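The aggregated results described above live in the `"results"` configuration and can be loaded the same way (a minimal sketch; per the split-naming convention above, the `"latest"` split always points at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split tracks the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_AA051611__V0202",
	"results",
	split="latest")
```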
## Latest results
These are the [latest results from run 2024-02-03T21:33:44.363250](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0202/blob/main/results_2024-02-03T21-33-44.363250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8475886247356212,
"acc_stderr": 0.023609522686145943,
"acc_norm": 0.8592262318029122,
"acc_norm_stderr": 0.023958294301700357,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5088923290302036,
"mc2_stderr": 0.015447986277853607
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.0140702655192688,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441375
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.004832934529120793,
"acc_norm": 0.8275243975303724,
"acc_norm_stderr": 0.003770211859118937
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8074074074074075,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.8074074074074075,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9407894736842105,
"acc_stderr": 0.01920689719680031,
"acc_norm": 0.9407894736842105,
"acc_norm_stderr": 0.01920689719680031
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.879245283018868,
"acc_stderr": 0.020054189400972373,
"acc_norm": 0.879245283018868,
"acc_norm_stderr": 0.020054189400972373
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9583333333333334,
"acc_stderr": 0.01671031580295999,
"acc_norm": 0.9583333333333334,
"acc_norm_stderr": 0.01671031580295999
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.028083594279575755,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.028083594279575755
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8893617021276595,
"acc_stderr": 0.020506145099008426,
"acc_norm": 0.8893617021276595,
"acc_norm_stderr": 0.020506145099008426
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7982456140350878,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.7982456140350878,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8413793103448276,
"acc_stderr": 0.030443500317583975,
"acc_norm": 0.8413793103448276,
"acc_norm_stderr": 0.030443500317583975
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.018296139984289767,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.018296139984289767
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9548387096774194,
"acc_stderr": 0.01181323762156236,
"acc_norm": 0.9548387096774194,
"acc_norm_stderr": 0.01181323762156236
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7881773399014779,
"acc_stderr": 0.028748983689941072,
"acc_norm": 0.7881773399014779,
"acc_norm_stderr": 0.028748983689941072
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9333333333333333,
"acc_stderr": 0.019478290326359282,
"acc_norm": 0.9333333333333333,
"acc_norm_stderr": 0.019478290326359282
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9646464646464646,
"acc_stderr": 0.013157318878046073,
"acc_norm": 0.9646464646464646,
"acc_norm_stderr": 0.013157318878046073
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909013,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909013
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.014491348171728305,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.014491348171728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.7666666666666667,
"acc_stderr": 0.025787874220959302,
"acc_norm": 0.7666666666666667,
"acc_norm_stderr": 0.025787874220959302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8949579831932774,
"acc_stderr": 0.019916300758805225,
"acc_norm": 0.8949579831932774,
"acc_norm_stderr": 0.019916300758805225
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6887417218543046,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.6887417218543046,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9467889908256881,
"acc_stderr": 0.009623385815462397,
"acc_norm": 0.9467889908256881,
"acc_norm_stderr": 0.009623385815462397
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.026491914727355164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.026491914727355164
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9558823529411765,
"acc_stderr": 0.014413198705704825,
"acc_norm": 0.9558823529411765,
"acc_norm_stderr": 0.014413198705704825
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9409282700421941,
"acc_stderr": 0.015346597463888693,
"acc_norm": 0.9409282700421941,
"acc_norm_stderr": 0.015346597463888693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.9147982062780269,
"acc_stderr": 0.01873745202573731,
"acc_norm": 0.9147982062780269,
"acc_norm_stderr": 0.01873745202573731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9236641221374046,
"acc_stderr": 0.02328893953617375,
"acc_norm": 0.9236641221374046,
"acc_norm_stderr": 0.02328893953617375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.02268340369172331,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.02268340369172331
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9629629629629629,
"acc_stderr": 0.018257067489429676,
"acc_norm": 0.9629629629629629,
"acc_norm_stderr": 0.018257067489429676
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9141104294478528,
"acc_stderr": 0.022014662933817535,
"acc_norm": 0.9141104294478528,
"acc_norm_stderr": 0.022014662933817535
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.8482142857142857,
"acc_stderr": 0.03405702838185695,
"acc_norm": 0.8482142857142857,
"acc_norm_stderr": 0.03405702838185695
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.02650144078476276,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.02650144078476276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9572649572649573,
"acc_stderr": 0.013250436685245011,
"acc_norm": 0.9572649572649573,
"acc_norm_stderr": 0.013250436685245011
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9527458492975734,
"acc_stderr": 0.007587612392626577,
"acc_norm": 0.9527458492975734,
"acc_norm_stderr": 0.007587612392626577
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8554913294797688,
"acc_stderr": 0.018929764513468728,
"acc_norm": 0.8554913294797688,
"acc_norm_stderr": 0.018929764513468728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8804469273743016,
"acc_stderr": 0.010850836082151255,
"acc_norm": 0.8804469273743016,
"acc_norm_stderr": 0.010850836082151255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9052287581699346,
"acc_stderr": 0.01677133127183646,
"acc_norm": 0.9052287581699346,
"acc_norm_stderr": 0.01677133127183646
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8938906752411575,
"acc_stderr": 0.017491946161301987,
"acc_norm": 0.8938906752411575,
"acc_norm_stderr": 0.017491946161301987
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9104938271604939,
"acc_stderr": 0.01588414107393756,
"acc_norm": 0.9104938271604939,
"acc_norm_stderr": 0.01588414107393756
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.75177304964539,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.75177304964539,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8292046936114733,
"acc_stderr": 0.009611645934807811,
"acc_norm": 0.8292046936114733,
"acc_norm_stderr": 0.009611645934807811
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01722970778103902,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01722970778103902
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.012795357747288056,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.012795357747288056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8454545454545455,
"acc_stderr": 0.03462262571262667,
"acc_norm": 0.8454545454545455,
"acc_norm_stderr": 0.03462262571262667
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8775510204081632,
"acc_stderr": 0.020985477705882164,
"acc_norm": 0.8775510204081632,
"acc_norm_stderr": 0.020985477705882164
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.945273631840796,
"acc_stderr": 0.016082815796263267,
"acc_norm": 0.945273631840796,
"acc_norm_stderr": 0.016082815796263267
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6867469879518072,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.6867469879518072,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9181286549707602,
"acc_stderr": 0.02102777265656387,
"acc_norm": 0.9181286549707602,
"acc_norm_stderr": 0.02102777265656387
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5088923290302036,
"mc2_stderr": 0.015447986277853607
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.4586808188021228,
"acc_stderr": 0.0137253773263428
}
}
```
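As a quick illustration of how these numbers can be consumed (a minimal sketch; `results` is assumed to be the dict shown above, e.g. parsed from the linked JSON file):

```python
# Mean 5-shot accuracy over the MMLU (hendrycksTest) subtasks shown above.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```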
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051611__V0202 | [
"region:us"
] | 2024-02-03T21:35:56+00:00 | {"pretty_name": "Evaluation run of AA051611/V0202", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/V0202](https://huggingface.co/AA051611/V0202) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__V0202\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T21:33:44.363250](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0202/blob/main/results_2024-02-03T21-33-44.363250.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8475886247356212,\n \"acc_stderr\": 0.023609522686145943,\n \"acc_norm\": 0.8592262318029122,\n \"acc_norm_stderr\": 0.023958294301700357,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5088923290302036,\n \"mc2_stderr\": 0.015447986277853607\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.0140702655192688,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441375\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n \"acc_stderr\": 0.004832934529120793,\n \"acc_norm\": 0.8275243975303724,\n \"acc_norm_stderr\": 0.003770211859118937\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8074074074074075,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.8074074074074075,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9407894736842105,\n \"acc_stderr\": 0.01920689719680031,\n \"acc_norm\": 0.9407894736842105,\n \"acc_norm_stderr\": 0.01920689719680031\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.879245283018868,\n \"acc_stderr\": 0.020054189400972373,\n \"acc_norm\": 0.879245283018868,\n \"acc_norm_stderr\": 0.020054189400972373\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9583333333333334,\n \"acc_stderr\": 0.01671031580295999,\n \"acc_norm\": 0.9583333333333334,\n \"acc_norm_stderr\": 0.01671031580295999\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 
0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.028083594279575755,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.028083594279575755\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8893617021276595,\n \"acc_stderr\": 0.020506145099008426,\n \"acc_norm\": 0.8893617021276595,\n \"acc_norm_stderr\": 0.020506145099008426\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7982456140350878,\n \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.7982456140350878,\n \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8413793103448276,\n \"acc_stderr\": 0.030443500317583975,\n \"acc_norm\": 0.8413793103448276,\n \"acc_norm_stderr\": 0.030443500317583975\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.018296139984289767,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.018296139984289767\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9548387096774194,\n \"acc_stderr\": 0.01181323762156236,\n \"acc_norm\": 0.9548387096774194,\n \"acc_norm_stderr\": 0.01181323762156236\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.7881773399014779,\n \"acc_stderr\": 0.028748983689941072,\n \"acc_norm\": 0.7881773399014779,\n \"acc_norm_stderr\": 0.028748983689941072\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9333333333333333,\n \"acc_stderr\": 0.019478290326359282,\n \"acc_norm\": 0.9333333333333333,\n \"acc_norm_stderr\": 0.019478290326359282\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9646464646464646,\n \"acc_stderr\": 0.013157318878046073,\n \"acc_norm\": 0.9646464646464646,\n \"acc_norm_stderr\": 0.013157318878046073\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909013,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909013\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.9102564102564102,\n \"acc_stderr\": 0.014491348171728305,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.014491348171728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.7666666666666667,\n \"acc_stderr\": 0.025787874220959302,\n \"acc_norm\": 0.7666666666666667,\n \"acc_norm_stderr\": 0.025787874220959302\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8949579831932774,\n \"acc_stderr\": 0.019916300758805225,\n \"acc_norm\": 0.8949579831932774,\n \"acc_norm_stderr\": 0.019916300758805225\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.6887417218543046,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.6887417218543046,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9467889908256881,\n \"acc_stderr\": 0.009623385815462397,\n \"acc_norm\": 0.9467889908256881,\n \"acc_norm_stderr\": 0.009623385815462397\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.026491914727355164,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.026491914727355164\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9558823529411765,\n \"acc_stderr\": 0.014413198705704825,\n \"acc_norm\": 0.9558823529411765,\n \"acc_norm_stderr\": 0.014413198705704825\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9409282700421941,\n \"acc_stderr\": 0.015346597463888693,\n \"acc_norm\": 0.9409282700421941,\n \"acc_norm_stderr\": 0.015346597463888693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.9147982062780269,\n \"acc_stderr\": 0.01873745202573731,\n \"acc_norm\": 0.9147982062780269,\n \"acc_norm_stderr\": 0.01873745202573731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9236641221374046,\n \"acc_stderr\": 0.02328893953617375,\n \"acc_norm\": 0.9236641221374046,\n \"acc_norm_stderr\": 0.02328893953617375\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9338842975206612,\n \"acc_stderr\": 0.02268340369172331,\n \"acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.02268340369172331\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9629629629629629,\n \"acc_stderr\": 0.018257067489429676,\n \"acc_norm\": 0.9629629629629629,\n \"acc_norm_stderr\": 0.018257067489429676\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9141104294478528,\n \"acc_stderr\": 0.022014662933817535,\n \"acc_norm\": 0.9141104294478528,\n \"acc_norm_stderr\": 0.022014662933817535\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.8482142857142857,\n \"acc_stderr\": 0.03405702838185695,\n \"acc_norm\": 0.8482142857142857,\n \"acc_norm_stderr\": 0.03405702838185695\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.02650144078476276,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.02650144078476276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9572649572649573,\n \"acc_stderr\": 0.013250436685245011,\n \"acc_norm\": 0.9572649572649573,\n \"acc_norm_stderr\": 0.013250436685245011\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9527458492975734,\n \"acc_stderr\": 0.007587612392626577,\n 
\"acc_norm\": 0.9527458492975734,\n \"acc_norm_stderr\": 0.007587612392626577\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8554913294797688,\n \"acc_stderr\": 0.018929764513468728,\n \"acc_norm\": 0.8554913294797688,\n \"acc_norm_stderr\": 0.018929764513468728\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8804469273743016,\n \"acc_stderr\": 0.010850836082151255,\n \"acc_norm\": 0.8804469273743016,\n \"acc_norm_stderr\": 0.010850836082151255\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9052287581699346,\n \"acc_stderr\": 0.01677133127183646,\n \"acc_norm\": 0.9052287581699346,\n \"acc_norm_stderr\": 0.01677133127183646\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8938906752411575,\n \"acc_stderr\": 0.017491946161301987,\n \"acc_norm\": 0.8938906752411575,\n \"acc_norm_stderr\": 0.017491946161301987\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.9104938271604939,\n \"acc_stderr\": 0.01588414107393756,\n \"acc_norm\": 0.9104938271604939,\n \"acc_norm_stderr\": 0.01588414107393756\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.75177304964539,\n \"acc_stderr\": 0.0257700156442904,\n \"acc_norm\": 0.75177304964539,\n \"acc_norm_stderr\": 0.0257700156442904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8292046936114733,\n \"acc_stderr\": 0.009611645934807811,\n \"acc_norm\": 0.8292046936114733,\n \"acc_norm_stderr\": 0.009611645934807811\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01722970778103902,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01722970778103902\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.012795357747288056,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.012795357747288056\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8454545454545455,\n \"acc_stderr\": 0.03462262571262667,\n \"acc_norm\": 0.8454545454545455,\n \"acc_norm_stderr\": 0.03462262571262667\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8775510204081632,\n \"acc_stderr\": 0.020985477705882164,\n \"acc_norm\": 0.8775510204081632,\n \"acc_norm_stderr\": 0.020985477705882164\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.945273631840796,\n \"acc_stderr\": 0.016082815796263267,\n \"acc_norm\": 0.945273631840796,\n \"acc_norm_stderr\": 0.016082815796263267\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6867469879518072,\n \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.6867469879518072,\n \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9181286549707602,\n \"acc_stderr\": 0.02102777265656387,\n \"acc_norm\": 0.9181286549707602,\n \"acc_norm_stderr\": 0.02102777265656387\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5088923290302036,\n \"mc2_stderr\": 0.015447986277853607\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4586808188021228,\n \"acc_stderr\": 0.0137253773263428\n }\n}\n```", "repo_url": 
"https://huggingface.co/AA051611/V0202", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["**/details_harness|winogrande|5_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T21-33-44.363250.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T21_33_44.363250", "path": ["results_2024-02-03T21-33-44.363250.parquet"]}, {"split": "latest", "path": 
["results_2024-02-03T21-33-44.363250.parquet"]}]}]} | 2024-02-03T21:36:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051611/V0202
Dataset automatically created during the evaluation run of model AA051611/V0202 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
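A minimal sketch; the details repository id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention rather than stated here:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention; adjust if it differs.
data = load_dataset("open-llm-leaderboard/details_AA051611__V0202",
	"harness_winogrande_5",
	split="train")
```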
## Latest results
These are the latest results from run 2024-02-03T21:33:44.363250 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051611/V0202\n\n\n\nDataset automatically created during the evaluation run of model AA051611/V0202 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T21:33:44.363250(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051611/V0202\n\n\n\nDataset automatically created during the evaluation run of model AA051611/V0202 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T21:33:44.363250(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eb2d95851a4cee3a86bd8f59077eca96f0feb151 |
# Dataset Card for Evaluation run of rizla/trrapi-16b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/trrapi-16b](https://huggingface.co/rizla/trrapi-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__trrapi-16b",
"harness_winogrande_5",
split="train")
```
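The aggregated metrics mentioned above can be loaded the same way; a short sketch, assuming the "results" configuration and "latest" split listed in this repo's configs:

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_rizla__trrapi-16b",
	"results",
	split="latest")
print(results[0])  # one row of aggregated metrics for the run
```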
## Latest results
These are the [latest results from run 2024-02-03T21:35:54.885186](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__trrapi-16b/blob/main/results_2024-02-03T21-35-54.885186.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6475033667590542,
"acc_stderr": 0.032124967055002625,
"acc_norm": 0.648064835420934,
"acc_norm_stderr": 0.03279129587010941,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272168,
"mc2": 0.7413221252292123,
"mc2_stderr": 0.014409709803356395
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601336
},
"harness|hellaswag|10": {
"acc": 0.7129057956582354,
"acc_stderr": 0.004514813363221144,
"acc_norm": 0.8887671778530173,
"acc_norm_stderr": 0.0031377764442772
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229091,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229091
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272168,
"mc2": 0.7413221252292123,
"mc2_stderr": 0.014409709803356395
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.009650242900291614
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002755
}
}
```
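The raw results file linked above can also be fetched directly; a sketch, assuming the per-task metrics sit either at the top level (as shown above) or under a "results" key, depending on the harness version:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_rizla__trrapi-16b",
    filename="results_2024-02-03T21-35-54.885186.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Handle both possible top-level layouts.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])       # average accuracy across tasks
print(metrics["harness|gsm8k|5"])  # metrics for a single task
```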
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rizla__trrapi-16b | [
"region:us"
] | 2024-02-03T21:38:13+00:00 | {"pretty_name": "Evaluation run of rizla/trrapi-16b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rizla/trrapi-16b](https://huggingface.co/rizla/trrapi-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__trrapi-16b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T21:35:54.885186](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__trrapi-16b/blob/main/results_2024-02-03T21-35-54.885186.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6475033667590542,\n \"acc_stderr\": 0.032124967055002625,\n \"acc_norm\": 0.648064835420934,\n \"acc_norm_stderr\": 0.03279129587010941,\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7413221252292123,\n \"mc2_stderr\": 0.014409709803356395\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601336\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7129057956582354,\n \"acc_stderr\": 0.004514813363221144,\n \"acc_norm\": 0.8887671778530173,\n \"acc_norm_stderr\": 0.0031377764442772\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229091,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229091\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 
0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7413221252292123,\n \"mc2_stderr\": 0.014409709803356395\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.009650242900291614\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \"acc_stderr\": 0.013423607564002755\n }\n}\n```", "repo_url": 
"https://huggingface.co/rizla/trrapi-16b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["**/details_harness|winogrande|5_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T21-35-54.885186.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T21_35_54.885186", "path": ["results_2024-02-03T21-35-54.885186.parquet"]}, {"split": "latest", "path": 
["results_2024-02-03T21-35-54.885186.parquet"]}]}]} | 2024-02-03T21:38:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rizla/trrapi-16b
Dataset automatically created during the evaluation run of model rizla/trrapi-16b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
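A minimal loading sketch, following the pattern used by the analogous details cards in this dump (the repo id `open-llm-leaderboard/details_rizla__trrapi-16b` is inferred from the `details_<org>__<model>` naming convention, so treat it as an assumption and verify it before use):

```python
from datasets import load_dataset

# Repo id inferred from the details_<org>__<model> convention (assumption, not confirmed by this card)
data = load_dataset("open-llm-leaderboard/details_rizla__trrapi-16b",
	"harness_winogrande_5",
	split="train")
```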
## Latest results
These are the latest results from run 2024-02-03T21:35:54.885186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
57cdfe9e25b96310d3fbb44a012e3bc9bbc5604c | license: unknown
---
| MarkrAI/msmarco_sample_autorag | [
"region:us"
] | 2024-02-03T21:46:46+00:00 | {"configs": [{"config_name": "qa", "splits": [{"name": "train", "data_files": "qa_train.parquet"}, {"name": "test", "data_files": "qa_test.parquet"}]}, {"config_name": "corpus", "data_files": "corpus.parquet"}]} | 2024-02-06T10:06:28+00:00 | [] | [] | TAGS
#region-us
| license: unknown
---
| [] | [
"TAGS\n#region-us \n"
] |
6947480d59d53212c57fba58c4a4f22b600e396c | license: apache-2.0
---
| MarkrAI/dstc_sample_autorag | [
"region:us"
] | 2024-02-03T21:49:23+00:00 | {"configs": [{"config_name": "qa", "data_files": "qa.parquet"}, {"config_name": "corpus", "data_files": "corpus.parquet"}]} | 2024-02-04T14:40:56+00:00 | [] | [] | TAGS
#region-us
| license: apache-2.0
---
| [] | [
"TAGS\n#region-us \n"
] |
c630ef1f7ba3d3fff92ca86b6e76bb083eb11928 |
# Dataset Card for Evaluation run of sarahlintang/mistral-indo-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sarahlintang/mistral-indo-7b](https://huggingface.co/sarahlintang/mistral-indo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sarahlintang__mistral-indo-7b",
"harness_winogrande_5",
split="train")
```
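To see which of the 63 configurations are available before loading one, they can be enumerated programmatically (a small sketch using the `datasets` inspection helper; config names follow the `harness_*` pattern listed in this repo's metadata):

```python
from datasets import get_dataset_config_names

# Enumerate every evaluation configuration stored in this details repo
configs = get_dataset_config_names("open-llm-leaderboard/details_sarahlintang__mistral-indo-7b")
print(len(configs), "configurations, e.g.:", sorted(configs)[:5])
```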
## Latest results
These are the [latest results from run 2024-02-03T21:47:22.078031](https://huggingface.co/datasets/open-llm-leaderboard/details_sarahlintang__mistral-indo-7b/blob/main/results_2024-02-03T21-47-22.078031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6261916134620902,
"acc_stderr": 0.032466151879964746,
"acc_norm": 0.6326766330349893,
"acc_norm_stderr": 0.033132950603814465,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.423359772241798,
"mc2_stderr": 0.014387833736527744
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650647,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6156144194383589,
"acc_stderr": 0.0048545552940175585,
"acc_norm": 0.8118900617406891,
"acc_norm_stderr": 0.0039000125049579717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.01581390128391305,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.01581390128391305
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983964,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983964
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578656,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578656
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.423359772241798,
"mc2_stderr": 0.014387833736527744
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409347
},
"harness|gsm8k|5": {
"acc": 0.3206974981046247,
"acc_stderr": 0.012856468433722302
}
}
```
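For a quick cross-task comparison, the per-task accuracies in the dict above can be flattened and ranked with a few lines of Python (a minimal sketch; it assumes the dict shown above has been saved locally as `results.json`, a hypothetical file name):

```python
import json

# Load a local copy of the results dict shown above (hypothetical file name)
with open("results.json") as f:
    results = json.load(f)

# Keep only entries that report plain accuracy; truthfulqa:mc reports mc1/mc2 instead
per_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# Print the ten strongest tasks first
for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(f"{task:55s} {acc:.4f}")
```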
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sarahlintang__mistral-indo-7b | [
"region:us"
] | 2024-02-03T21:49:44+00:00 | {"pretty_name": "Evaluation run of sarahlintang/mistral-indo-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [sarahlintang/mistral-indo-7b](https://huggingface.co/sarahlintang/mistral-indo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sarahlintang__mistral-indo-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T21:47:22.078031](https://huggingface.co/datasets/open-llm-leaderboard/details_sarahlintang__mistral-indo-7b/blob/main/results_2024-02-03T21-47-22.078031.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6261916134620902,\n \"acc_stderr\": 0.032466151879964746,\n \"acc_norm\": 0.6326766330349893,\n \"acc_norm_stderr\": 0.033132950603814465,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.423359772241798,\n \"mc2_stderr\": 0.014387833736527744\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650647,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6156144194383589,\n \"acc_stderr\": 0.0048545552940175585,\n \"acc_norm\": 0.8118900617406891,\n \"acc_norm_stderr\": 0.0039000125049579717\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n \"acc_stderr\": 0.01581390128391305,\n \"acc_norm\": 0.33743016759776534,\n \"acc_norm_stderr\": 0.01581390128391305\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983964,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983964\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578656,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578656\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.423359772241798,\n \"mc2_stderr\": 0.014387833736527744\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409347\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3206974981046247,\n \"acc_stderr\": 
0.012856468433722302\n }\n}\n```", "repo_url": "https://huggingface.co/sarahlintang/mistral-indo-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T21_47_22.078031", "path": ["**/details_harness|winogrande|5_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T21-47-22.078031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T21_47_22.078031", "path": ["results_2024-02-03T21-47-22.078031.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T21-47-22.078031.parquet"]}]}]} | 2024-02-03T21:50:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sarahlintang/mistral-indo-7b
Dataset automatically created during the evaluation run of model sarahlintang/mistral-indo-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-03T21:47:22.078031 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sarahlintang/mistral-indo-7b\n\n\n\nDataset automatically created during the evaluation run of model sarahlintang/mistral-indo-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T21:47:22.078031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sarahlintang/mistral-indo-7b\n\n\n\nDataset automatically created during the evaluation run of model sarahlintang/mistral-indo-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T21:47:22.078031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3771d40a338660c25123518c2089e2c06ec7a756 |
# Dataset Card for Evaluation run of Obrolin/Kesehatan-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Obrolin/Kesehatan-7B-v0.1](https://huggingface.co/Obrolin/Kesehatan-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
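# Each of the 63 task configs can be requested by name; here, the
# 5-shot Winogrande details (the full config list is in this repo's metadata).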
data = load_dataset("open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1",
"harness_winogrande_5",
split="train")
```
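
The repo also exposes an aggregated `results` config with a `latest` split that resolves to the most recent run, which is often more convenient than pulling one task at a time. A minimal sketch:

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" always resolves to the newest eval.
results = load_dataset(
    "open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1",
    "results",
    split="latest",
)
print(results[0])  # one row per run, holding the aggregated metrics
```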
## Latest results
These are the [latest results from run 2024-02-03T22:00:30.966054](https://huggingface.co/datasets/open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1/blob/main/results_2024-02-03T22-00-30.966054.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5977471674915434,
"acc_stderr": 0.03354675339314637,
"acc_norm": 0.6033162868765426,
"acc_norm_stderr": 0.03424412262997995,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5067930984526436,
"mc2_stderr": 0.015515560312684274
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996081,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180635
},
"harness|hellaswag|10": {
"acc": 0.6266679944234216,
"acc_stderr": 0.004827006520802886,
"acc_norm": 0.8254331806413066,
"acc_norm_stderr": 0.0037882037293466985
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037495,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037495
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31508379888268156,
"acc_stderr": 0.015536850852473631,
"acc_norm": 0.31508379888268156,
"acc_norm_stderr": 0.015536850852473631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.01259950560833646,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.01259950560833646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271765,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5067930984526436,
"mc2_stderr": 0.015515560312684274
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650872
},
"harness|gsm8k|5": {
"acc": 0.32221379833206976,
"acc_stderr": 0.01287243548118878
}
}
```
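
As a quick sanity check, the MMLU-style aggregate can be recomputed from the blob above as the unweighted mean of the `harness|hendrycksTest-*` accuracies. A minimal sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename):

```python
import json

# Load the "Latest results" JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Collect the per-subtask accuracies of the 57 MMLU (hendrycksTest) configs.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```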
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1 | [
"region:us"
] | 2024-02-03T22:02:50+00:00 | {"pretty_name": "Evaluation run of Obrolin/Kesehatan-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Obrolin/Kesehatan-7B-v0.1](https://huggingface.co/Obrolin/Kesehatan-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T22:00:30.966054](https://huggingface.co/datasets/open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1/blob/main/results_2024-02-03T22-00-30.966054.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5977471674915434,\n \"acc_stderr\": 0.03354675339314637,\n \"acc_norm\": 0.6033162868765426,\n \"acc_norm_stderr\": 0.03424412262997995,\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5067930984526436,\n \"mc2_stderr\": 0.015515560312684274\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996081,\n \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180635\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6266679944234216,\n \"acc_stderr\": 0.004827006520802886,\n \"acc_norm\": 0.8254331806413066,\n \"acc_norm_stderr\": 0.0037882037293466985\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790222,\n \"acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790222\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n \"acc_stderr\": 0.015190473717037495,\n \"acc_norm\": 
0.7637292464878672,\n \"acc_norm_stderr\": 0.015190473717037495\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31508379888268156,\n \"acc_stderr\": 0.015536850852473631,\n \"acc_norm\": 0.31508379888268156,\n \"acc_norm_stderr\": 0.015536850852473631\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n \"acc_stderr\": 0.01259950560833646,\n \"acc_norm\": 0.41851368970013036,\n \"acc_norm_stderr\": 0.01259950560833646\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271765,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872475,\n \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5067930984526436,\n \"mc2_stderr\": 0.015515560312684274\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650872\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32221379833206976,\n \"acc_stderr\": 0.01287243548118878\n }\n}\n```", "repo_url": 
"https://huggingface.co/Obrolin/Kesehatan-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-00-30.966054.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-00-30.966054.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-00-30.966054.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-00-30.966054.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-00-30.966054.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T22_00_30.966054", "path": ["**/details_harness|winogrande|5_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T22-00-30.966054.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T22_00_30.966054", "path": ["results_2024-02-03T22-00-30.966054.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T22-00-30.966054.parquet"]}]}]} | 2024-02-03T22:03:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Obrolin/Kesehatan-7B-v0.1
Dataset automatically created during the evaluation run of model Obrolin/Kesehatan-7B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
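A minimal sketch; the dataset repo id below is inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention and is not confirmed by this card (the `harness_winogrande_5` config name does appear in this card's metadata):

```python
from datasets import load_dataset

# Load the details of one evaluated task (here the 5-shot winogrande harness).
# The repo id is inferred from the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_Obrolin__Kesehatan-7B-v0.1",
	"harness_winogrande_5",
	split="train")
```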
## Latest results
These are the latest results from run 2024-02-03T22:00:30.966054 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Obrolin/Kesehatan-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model Obrolin/Kesehatan-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:00:30.966054(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Obrolin/Kesehatan-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model Obrolin/Kesehatan-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:00:30.966054(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
580f7e9beddc37f27c82fd4052d414dfb7cd95b8 |
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k",
"harness_winogrande_5",
split="train")
```
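The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch, assuming (as in other leaderboard details repos) that a "latest" split aliases the most recent timestamped run:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" is assumed to alias the most
# recent timestamped split, per the convention of these details repos.
results = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k",
	"results",
	split="latest")
```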
## Latest results
These are the [latest results from run 2024-02-03T22:04:01.545858](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k/blob/main/results_2024-02-03T22-04-01.545858.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6917313320960925,
"acc_stderr": 0.030694742722936492,
"acc_norm": 0.6985909537220241,
"acc_norm_stderr": 0.03127101232040077,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248167,
"mc2": 0.5913829725730599,
"mc2_stderr": 0.015218394186481066
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251104,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094094
},
"harness|hellaswag|10": {
"acc": 0.5156343357896833,
"acc_stderr": 0.004987341485856662,
"acc_norm": 0.6695877315275841,
"acc_norm_stderr": 0.004694002781939541
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.03097669299853443,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.03097669299853443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.02606431340630452,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.02606431340630452
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8660550458715597,
"acc_stderr": 0.014602811435592635,
"acc_norm": 0.8660550458715597,
"acc_norm_stderr": 0.014602811435592635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878453,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132346,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132346
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8786717752234994,
"acc_stderr": 0.011675913883906736,
"acc_norm": 0.8786717752234994,
"acc_norm_stderr": 0.011675913883906736
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.01663961523684581,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.01663961523684581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02273378940544758,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02273378940544758
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.02147349183480833,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.02147349183480833
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.516297262059974,
"acc_stderr": 0.01276345073469981,
"acc_norm": 0.516297262059974,
"acc_norm_stderr": 0.01276345073469981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.02604066247420126,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.02604066247420126
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.02635891633490403,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.02635891633490403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401712,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401712
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248167,
"mc2": 0.5913829725730599,
"mc2_stderr": 0.015218394186481066
},
"harness|winogrande|5": {
"acc": 0.681136543014996,
"acc_stderr": 0.013097928420088771
},
"harness|gsm8k|5": {
"acc": 0.48142532221379836,
"acc_stderr": 0.013762977910317581
}
}
```
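For quick inspection, the per-task scores above can be aggregated with a few lines of Python. A minimal sketch, assuming the JSON block above has been saved locally as `results.json` (the file name is illustrative; the key names match the harness output shown):

```python
import json

# Load the harness results shown above (saved locally as results.json).
with open("results.json") as f:
    results = json.load(f)

# Collect the accuracy of every MMLU (hendrycksTest) subtask.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}

print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")

# Strongest and weakest subtasks, for a quick sanity check.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked[:3] + ranked[-3:]:
    print(f"{acc:.3f}  {task}")
```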
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k | [
"region:us"
] | 2024-02-03T22:06:20+00:00 | {"pretty_name": "Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T22:04:01.545858](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k/blob/main/results_2024-02-03T22-04-01.545858.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6917313320960925,\n \"acc_stderr\": 0.030694742722936492,\n \"acc_norm\": 0.6985909537220241,\n \"acc_norm_stderr\": 0.03127101232040077,\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248167,\n \"mc2\": 0.5913829725730599,\n \"mc2_stderr\": 0.015218394186481066\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251104,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094094\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5156343357896833,\n \"acc_stderr\": 0.004987341485856662,\n \"acc_norm\": 0.6695877315275841,\n \"acc_norm_stderr\": 0.004694002781939541\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.03097669299853443,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.03097669299853443\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n 
\"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.02606431340630452,\n \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.02606431340630452\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8660550458715597,\n \"acc_stderr\": 0.014602811435592635,\n \"acc_norm\": 0.8660550458715597,\n \"acc_norm_stderr\": 0.014602811435592635\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878453,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878453\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752597,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752597\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.017004368568132346,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.017004368568132346\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8786717752234994,\n \"acc_stderr\": 0.011675913883906736,\n \"acc_norm\": 0.8786717752234994,\n \"acc_norm_stderr\": 0.011675913883906736\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n \"acc_stderr\": 0.01663961523684581,\n \"acc_norm\": 0.45027932960893857,\n \"acc_norm_stderr\": 0.01663961523684581\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02273378940544758,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02273378940544758\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.02147349183480833,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.02147349183480833\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.516297262059974,\n \"acc_stderr\": 0.01276345073469981,\n \"acc_norm\": 0.516297262059974,\n \"acc_norm_stderr\": 0.01276345073469981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.02604066247420126,\n \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.02604066247420126\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427657,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.02635891633490403,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.02635891633490403\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401712,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401712\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248167,\n \"mc2\": 0.5913829725730599,\n \"mc2_stderr\": 0.015218394186481066\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.681136543014996,\n \"acc_stderr\": 0.013097928420088771\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \"acc_stderr\": 
0.013762977910317581\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-04-01.545858.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-04-01.545858.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-04-01.545858.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-04-01.545858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-04-01.545858.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T22_04_01.545858", "path": ["**/details_harness|winogrande|5_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T22-04-01.545858.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T22_04_01.545858", "path": ["results_2024-02-03T22-04-01.545858.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T22-04-01.545858.parquet"]}]}]} | 2024-02-03T22:06:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
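A minimal example, mirroring the loader given in this dataset's metadata (`harness_winogrande_5` is one of the 63 config names):

```python
from datasets import load_dataset

# Load the detailed per-sample results for one task's run.
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v17.3-32k",
                    "harness_winogrande_5",
                    split="train")
```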
## Latest results
These are the latest results from run 2024-02-03T22:04:01.545858 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:04:01.545858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v17.3-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:04:01.545858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
252d9c7c4e56e704fc54b95c61df4ab7fb385883 |
# Dataset Card for Evaluation run of Inv/Konstanta-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-7B](https://huggingface.co/Inv/Konstanta-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-7B",
"harness_winogrande_5",
split="train")
```
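
The same pattern works for any of the 63 task configurations, or for the aggregated "results" configuration described above. A minimal sketch (assuming the `datasets` library is installed; the config and split names below are taken from this card):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Inv__Konstanta-7B"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Per this card, the "train" (and "latest") split points to the most recent run.
agg = load_dataset(repo, "results", split="train")
print(agg[0])
```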
## Latest results
These are the [latest results from run 2024-02-03T22:20:39.657096](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-7B/blob/main/results_2024-02-03T22-20-39.657096.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6557091429600252,
"acc_stderr": 0.03198588266987017,
"acc_norm": 0.6552281106519231,
"acc_norm_stderr": 0.032650421126591875,
"mc1": 0.48714810281517745,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6542664114049395,
"mc2_stderr": 0.014950723151719138
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.013640943091946531,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.013385021637313577
},
"harness|hellaswag|10": {
"acc": 0.704142601075483,
"acc_stderr": 0.004554944020620486,
"acc_norm": 0.8750248954391555,
"acc_norm_stderr": 0.003300148445609133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512624,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48714810281517745,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6542664114049395,
"mc2_stderr": 0.014950723151719138
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855932
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.012493927348659629
}
}
```
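
To work with these numbers programmatically rather than reading the JSON by eye, the raw results file can be pulled straight from this repo with `huggingface_hub`. A minimal sketch (it assumes the per-task scores sit under a top-level `results` key, as these leaderboard dumps typically do; the filename is the one linked above):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above (repo_type="dataset" is required
# because this is a dataset repo, not a model repo).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Inv__Konstanta-7B",
    filename="results_2024-02-03T22-20-39.657096.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumed layout: per-task metrics under a top-level "results" key.
results = data["results"]

# Rank the 57 MMLU (hendrycksTest) subtasks by accuracy.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc:.3f}  {task}")
```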
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-03T22:22:57+00:00 | {"pretty_name": "Evaluation run of Inv/Konstanta-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Inv/Konstanta-7B](https://huggingface.co/Inv/Konstanta-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T22:20:39.657096](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-7B/blob/main/results_2024-02-03T22-20-39.657096.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557091429600252,\n \"acc_stderr\": 0.03198588266987017,\n \"acc_norm\": 0.6552281106519231,\n \"acc_norm_stderr\": 0.032650421126591875,\n \"mc1\": 0.48714810281517745,\n \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6542664114049395,\n \"mc2_stderr\": 0.014950723151719138\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946531,\n \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.013385021637313577\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.704142601075483,\n \"acc_stderr\": 0.004554944020620486,\n \"acc_norm\": 0.8750248954391555,\n \"acc_norm_stderr\": 0.003300148445609133\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512624,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741622,\n 
\"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48714810281517745,\n \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6542664114049395,\n \"mc2_stderr\": 0.014950723151719138\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855932\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \"acc_stderr\": 0.012493927348659629\n }\n}\n```", "repo_url": 
"https://huggingface.co/Inv/Konstanta-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-20-39.657096.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-20-39.657096.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-20-39.657096.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-20-39.657096.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-20-39.657096.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-20-39.657096.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["**/details_harness|winogrande|5_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T22-20-39.657096.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T22_20_39.657096", "path": ["results_2024-02-03T22-20-39.657096.parquet"]}, {"split": "latest", "path": 
["results_2024-02-03T22-20-39.657096.parquet"]}]}]} | 2024-02-03T22:23:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Inv/Konstanta-7B
Dataset automatically created during the evaluation run of model Inv/Konstanta-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
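A minimal sketch, assuming this card follows the same loading pattern as the other evaluation cards in this document and that the details repo uses the leaderboard's usual `details_<org>__<model>` naming (so `open-llm-leaderboard/details_Inv__Konstanta-7B` here):

```python
from datasets import load_dataset

# Load one task configuration; the "train" split always points at the latest run.
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-7B",
                    "harness_winogrande_5",
                    split="train")
```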
## Latest results
These are the latest results from run 2024-02-03T22:20:39.657096 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
# MobileConvRec

The applications' .apk files can be requested through this form.<br>
[Request .apk files](https://forms.office.com/pages/responsepage.aspx?id=P61NLa5Q2UeDoJrisfRm-J9OUpsC3GtDhm92SdH8b41UNE9GT05IWFUxQ0VRQ1JQRTE2S1lMNFFIMi4u)
A quick usage example of MobileConvRec dataset.
### Install the datasets library

```
%pip install datasets
```
### Reading the Dataset
```python
from datasets import load_dataset

mbr_conv_rec_ds = load_dataset('recmeapp/MobileConvRec')
```
### Reading the App MetaData
```python
app_metadata = load_dataset('recmeapp/MobileConvRec', data_dir='app_meta')
```
### How many dialogs are there in different splits?
```python
train_data = mbr_conv_rec_ds['train']
valid_data = mbr_conv_rec_ds['validation']
test_data = mbr_conv_rec_ds['test']
print(f'There are {len(train_data)} dialogs in train split')
print(f'There are {len(valid_data)} dialogs in valid split')
print(f'There are {len(test_data)} dialogs in test split')
```
<b>The output of the above snippet is:</b><br>
There are 19195 dialogs in train split<br>
There are 5215 dialogs in valid split<br>
There are 3290 dialogs in test split<br>
#### Visualize the train/valid/test splits
```python
print(mbr_conv_rec_ds)
```
The above snippet will show the following output: <br>
```
DatasetDict({
train: Dataset({
features: ['user_id', 'user_previous_interactions', 'recommended_app', 'turns'],
num_rows: 19195
})
validation: Dataset({
features: ['user_id', 'user_previous_interactions', 'recommended_app', 'turns'],
num_rows: 5215
})
test: Dataset({
features: ['user_id', 'user_previous_interactions', 'recommended_app', 'turns'],
num_rows: 3290
})
})
```
#### Visualize the app metadata object
```
print(app_metadata)
DatasetDict({
train: Dataset({
features: ['app_package', 'app_name', 'developer_name', 'app_category', 'description', 'content_rating', 'num_reviews', 'price', 'avg_rating', 'security_practices', 'does_have_ads', 'permission', 'data_shared', 'data_collected'],
num_rows: 2308
})
})
```
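For a quick sanity check on individual rows, here is a small sketch that prints a few fields from one metadata record; the field names come from the features listed above, while the exact value types are an assumption:

```python
# Inspect one app's metadata row (field names taken from the features above).
row = app_metadata['train'][0]
print(row['app_name'], '-', row['app_category'])
print('avg rating:', row['avg_rating'], '| reviews:', row['num_reviews'])
```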
### Reading records from the dataset
#### Reading a single document up to the recommendation turn
```python
# From a single document, get all the turns up to and including
# the turn with the recommendation.
dialog_upto_recom_turn = []
for t in train_data[0]['turns']:
    if not t['recommendation_turn']:
        # non-recommendation turn
        dialog_upto_recom_turn.append(t)
    else:
        # recommendation turn
        dialog_upto_recom_turn.append(t)
        break
```
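Building on the snippet above, a hedged sketch that aggregates over the whole training split; it assumes only the `turns` and `recommendation_turn` fields shown above and that each dialog contains at most one recommendation turn:

```python
# Average number of turns that precede the recommendation in train dialogs.
turns_before_rec = []
for dialog in train_data:
    n = 0
    for t in dialog['turns']:
        if t['recommendation_turn']:
            break
        n += 1
    turns_before_rec.append(n)
avg = sum(turns_before_rec) / len(turns_before_rec)
print(f'avg turns before recommendation: {avg:.2f}')
```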
# Dataset Card for Evaluation run of fhai50032/BeagleLake-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fhai50032/BeagleLake-7B](https://huggingface.co/fhai50032/BeagleLake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fhai50032__BeagleLake-7B",
"harness_winogrande_5",
split="train")
```
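If you first want to enumerate the 63 available configurations rather than hard-coding one, a short sketch using the standard `datasets` helper:

```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names("open-llm-leaderboard/details_fhai50032__BeagleLake-7B")
print(len(configs), configs[:5])
```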
## Latest results
These are the [latest results from run 2024-02-03T22:36:10.253997](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__BeagleLake-7B/blob/main/results_2024-02-03T22-36-10.253997.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6468122209110102,
"acc_stderr": 0.032180093524919205,
"acc_norm": 0.6474338424936177,
"acc_norm_stderr": 0.03283953683706756,
"mc1": 0.4908200734394125,
"mc1_stderr": 0.01750055072481975,
"mc2": 0.6491975727907208,
"mc2_stderr": 0.015417969486375667
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441374,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.01334091608524625
},
"harness|hellaswag|10": {
"acc": 0.6981676956781517,
"acc_stderr": 0.004581147247963204,
"acc_norm": 0.8738299143596893,
"acc_norm_stderr": 0.0033136235601649304
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590158,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590158
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570765,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4908200734394125,
"mc1_stderr": 0.01750055072481975,
"mc2": 0.6491975727907208,
"mc2_stderr": 0.015417969486375667
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166746
},
"harness|gsm8k|5": {
"acc": 0.6391205458680819,
"acc_stderr": 0.013228626753925147
}
}
```
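The aggregated numbers above can also be pulled programmatically; a minimal sketch, assuming the "results" configuration and "latest" split that these evaluation repos conventionally expose:

```python
from datasets import load_dataset

# Load the aggregated results of the latest run.
results = load_dataset("open-llm-leaderboard/details_fhai50032__BeagleLake-7B",
                       "results",
                       split="latest")
print(results[0])
```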
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-03T22:38:30+00:00 | {"pretty_name": "Evaluation run of fhai50032/BeagleLake-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [fhai50032/BeagleLake-7B](https://huggingface.co/fhai50032/BeagleLake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fhai50032__BeagleLake-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T22:36:10.253997](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__BeagleLake-7B/blob/main/results_2024-02-03T22-36-10.253997.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6468122209110102,\n \"acc_stderr\": 0.032180093524919205,\n \"acc_norm\": 0.6474338424936177,\n \"acc_norm_stderr\": 0.03283953683706756,\n \"mc1\": 0.4908200734394125,\n \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6491975727907208,\n \"mc2_stderr\": 0.015417969486375667\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.01334091608524625\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6981676956781517,\n \"acc_stderr\": 0.004581147247963204,\n \"acc_norm\": 0.8738299143596893,\n \"acc_norm_stderr\": 0.0033136235601649304\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n 
\"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590158,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590158\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n 
\"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570765,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570765\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4908200734394125,\n \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6491975727907208,\n \"mc2_stderr\": 0.015417969486375667\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166746\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6391205458680819,\n \"acc_stderr\": 0.013228626753925147\n }\n}\n```", "repo_url": "https://huggingface.co/fhai50032/BeagleLake-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-36-10.253997.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-36-10.253997.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-36-10.253997.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T22-36-10.253997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-36-10.253997.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T22-36-10.253997.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["**/details_harness|winogrande|5_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T22-36-10.253997.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T22_36_10.253997", "path": ["results_2024-02-03T22-36-10.253997.parquet"]}, {"split": "latest", "path": 
["results_2024-02-03T22-36-10.253997.parquet"]}]}]} | 2024-02-03T22:38:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fhai50032/BeagleLake-7B
Dataset automatically created during the evaluation run of model fhai50032/BeagleLake-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
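A minimal sketch of the stripped snippet (assuming the repo id follows the leaderboard's `details_<org>__<model>` naming and the `harness_winogrande_5` config listed in this record's metadata):

```python
from datasets import load_dataset

# Load one evaluation configuration; the "train" split always points at the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_fhai50032__BeagleLake-7B",
    "harness_winogrande_5",
    split="train",
)
```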
## Latest results
These are the latest results from run 2024-02-03T22:36:10.253997 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fhai50032/BeagleLake-7B\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/BeagleLake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:36:10.253997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fhai50032/BeagleLake-7B\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/BeagleLake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T22:36:10.253997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
695cb6315cb5c2480386a5edcb1ecd91c0025cc2 |
A small but complete web scrape of the old qysh.me website, now hosted on dua.com. | benxh/qysh-me-shqip-scrape-dataset | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:sq",
"license:apache-2.0",
"albanian",
"sq",
"al",
"al_sq",
"scrape",
"region:us"
] | 2024-02-03T23:12:30+00:00 | {"language": ["sq"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-generation"], "tags": ["albanian", "sq", "al", "al_sq", "scrape"]} | 2024-02-03T23:16:03+00:00 | [] | [
"sq"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-Albanian #license-apache-2.0 #albanian #sq #al #al_sq #scrape #region-us
|
A small but complete web scrape of the old URL website, now hosted on URL. | [] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-Albanian #license-apache-2.0 #albanian #sq #al #al_sq #scrape #region-us \n"
] |
40aba8cb40a1dde6ce8bccab8c072e252d033fed |
# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [g-ronimo/phi-2-OpenHermes-2.5](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5",
"harness_winogrande_5",
split="train")
```
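The same call also accepts the timestamped split name to pin a specific run rather than following the moving "train" alias; a minimal variant, assuming the split naming shown in this card's metadata:

```python
from datasets import load_dataset

# Pin the single run recorded for this model by its timestamped split.
data_pinned = load_dataset(
    "open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5",
    "harness_winogrande_5",
    split="2024_02_03T23_27_26.780364",
)
```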
## Latest results
These are the [latest results from run 2024-02-03T23:27:26.780364](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5/blob/main/results_2024-02-03T23-27-26.780364.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.55640741934187,
"acc_stderr": 0.033971441894622756,
"acc_norm": 0.5591434075224653,
"acc_norm_stderr": 0.034674408120535544,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4386157539402909,
"mc2_stderr": 0.015167060058989778
},
"harness|arc:challenge|25": {
"acc": 0.5716723549488054,
"acc_stderr": 0.014460496367599015,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.5640310695080661,
"acc_stderr": 0.004948696280312428,
"acc_norm": 0.7484564827723561,
"acc_norm_stderr": 0.004330134219762838
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246497,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246497
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534323,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.02977866303775295,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.02977866303775295
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230172,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230172
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319354,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319354
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740477,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740477
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2122905027932961,
"acc_stderr": 0.013676644685831714,
"acc_norm": 0.2122905027932961,
"acc_norm_stderr": 0.013676644685831714
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809065,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934016,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934016
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925654,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370604,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370604
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860085,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860085
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492534,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492534
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555403,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555403
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4386157539402909,
"mc2_stderr": 0.015167060058989778
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930696
},
"harness|gsm8k|5": {
"acc": 0.4116755117513268,
"acc_stderr": 0.01355589744989005
}
}
```
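For programmatic access to the aggregated numbers above, the "results" configuration can be loaded directly; a minimal sketch, assuming the `results` config and `latest` split names listed in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated record (exact schema may vary per run)
```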
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5 | [
"region:us"
] | 2024-02-03T23:29:12+00:00 | {"pretty_name": "Evaluation run of g-ronimo/phi-2-OpenHermes-2.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [g-ronimo/phi-2-OpenHermes-2.5](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T23:27:26.780364](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5/blob/main/results_2024-02-03T23-27-26.780364.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.55640741934187,\n \"acc_stderr\": 0.033971441894622756,\n \"acc_norm\": 0.5591434075224653,\n \"acc_norm_stderr\": 0.034674408120535544,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4386157539402909,\n \"mc2_stderr\": 0.015167060058989778\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5716723549488054,\n \"acc_stderr\": 0.014460496367599015,\n \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5640310695080661,\n \"acc_stderr\": 0.004948696280312428,\n \"acc_norm\": 0.7484564827723561,\n \"acc_norm_stderr\": 0.004330134219762838\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246497,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246497\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534323,\n \"acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534323\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775295,\n \"acc_norm\": 0.7823834196891192,\n 
\"acc_norm_stderr\": 0.02977866303775295\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933886,\n \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933886\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.781651376146789,\n \"acc_stderr\": 0.017712600528722727,\n \"acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722727\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319354,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319354\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n \"acc_stderr\": 0.016808322261740477,\n \"acc_norm\": 0.6704980842911877,\n \"acc_norm_stderr\": 0.016808322261740477\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2122905027932961,\n \"acc_stderr\": 0.013676644685831714,\n \"acc_norm\": 0.2122905027932961,\n \"acc_norm_stderr\": 0.013676644685831714\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809065,\n \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809065\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n \"acc_stderr\": 0.027210420375934016,\n \"acc_norm\": 0.6430868167202572,\n \"acc_norm_stderr\": 0.027210420375934016\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925654,\n \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925654\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370604,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370604\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n \"acc_stderr\": 0.012576779494860085,\n \"acc_norm\": 0.4132985658409387,\n \"acc_norm_stderr\": 0.012576779494860085\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492534,\n \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492534\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555403,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555403\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4386157539402909,\n \"mc2_stderr\": 0.015167060058989778\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930696\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.4116755117513268,\n \"acc_stderr\": 0.01355589744989005\n }\n}\n```", "repo_url": "https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|arc:challenge|25_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|gsm8k|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hellaswag|10_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T23-27-26.780364.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T23-27-26.780364.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T23-27-26.780364.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T23-27-26.780364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T23-27-26.780364.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["**/details_harness|winogrande|5_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-03T23-27-26.780364.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T23_27_26.780364", "path": ["results_2024-02-03T23-27-26.780364.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T23-27-26.780364.parquet"]}]}]} | 2024-02-03T23:29:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5
Dataset automatically created during the evaluation run of model g-ronimo/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
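With the Hugging Face `datasets` library, a minimal sketch (the repo id below is an assumption, following the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5",
    "harness_winogrande_5",  # one of the 63 evaluation configurations
    split="train")           # "train" always points to the latest results
```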
## Latest results
These are the latest results from run 2024-02-03T23:27:26.780364 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5\n\n\n\nDataset automatically created during the evaluation run of model g-ronimo/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T23:27:26.780364(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5\n\n\n\nDataset automatically created during the evaluation run of model g-ronimo/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T23:27:26.780364(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
44b76a71ee3fd08bd89c88dcdf63b63ead3a1239 | licensed under Hippocratic License HL3-CL-ECO-EXTR
[Hippocratic License HL3-CL-ECO-EXTR](https://firstdonoharm.dev/version/3/0/cl-eco-extr.html)
Data is in ENGLISH, NOT KOREAN. Rather, it details, in English, recipes from Korean Natural Farming, a tradition of regenerative agriculture practices and an Indigenous Knowledge System.
This dataset was constructed by Caleb DeLeeuw (Solshine) using a langchain RAG embeddings system over the famous Natural Farming text "Dr. Cho's Global Natural Farming" to extract the key fertilizer recipes as data chunks.
The dataset is in JSON format. | Solshine/Natural_Farming_Recipes_Datachunks | [
"license:other",
"biology",
"climate",
"region:us"
] | 2024-02-03T23:31:20+00:00 | {"license": "other", "pretty_name": "Natural Farming Fertilizer Recipes", "tags": ["biology", "climate"]} | 2024-02-08T07:19:25+00:00 | [] | [] | TAGS
#license-other #biology #climate #region-us
| licensed under Hippocratic License HL3-CL-ECO-EXTR
 using a langchain RAG embeddings system over the famous Natural Farming Text "Dr. Cho's Global Natural Farming" to extract the key fertilizer recipes as data chunks.
The dataset is in JSON format. | [] | [
"TAGS\n#license-other #biology #climate #region-us \n"
] |
a1d45399c82ecebf3685c262f8e80872a01f2dec |
# Dataset Card for Evaluation run of Weyaxi/Einstein-v2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-v2-7B](https://huggingface.co/Weyaxi/Einstein-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Load the details of one evaluation task; "harness_winogrande_5" is one of
# the 63 configurations, and "train" always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B",
	"harness_winogrande_5",
	split="train")
```
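To discover the other configurations, or to pull the aggregated metrics directly, a short sketch (this assumes the "results" configuration and its "latest" split described above):

```python
from datasets import get_dataset_config_names, load_dataset

# List all 63 evaluation configurations available in this repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B")
print(configs[:5])

# The "results" configuration aggregates every metric of the run;
# the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B",
	"results",
	split="latest")
```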
## Latest results
These are the [latest results from run 2024-02-04T00:18:54.790433](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B/blob/main/results_2024-02-04T00-18-54.790433.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6203800189560776,
"acc_stderr": 0.032564602290854144,
"acc_norm": 0.6244404890114698,
"acc_norm_stderr": 0.033222709483401835,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.5052388587667219,
"mc2_stderr": 0.014940162719394304
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009124,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407154
},
"harness|hellaswag|10": {
"acc": 0.6419040031866162,
"acc_stderr": 0.0047846072227746405,
"acc_norm": 0.8345947022505477,
"acc_norm_stderr": 0.0037078660457296035
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919426,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.6,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091098,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091098
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303954,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303954
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.5052388587667219,
"mc2_stderr": 0.014940162719394304
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235807
},
"harness|gsm8k|5": {
"acc": 0.4313874147081122,
"acc_stderr": 0.013642195352511571
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__einstein-v2-test-model | [
"region:us"
] | 2024-02-04T00:21:13+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Einstein-v2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-v2-7B](https://huggingface.co/Weyaxi/Einstein-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T00:18:54.790433](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B/blob/main/results_2024-02-04T00-18-54.790433.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6203800189560776,\n \"acc_stderr\": 0.032564602290854144,\n \"acc_norm\": 0.6244404890114698,\n \"acc_norm_stderr\": 0.033222709483401835,\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.5052388587667219,\n \"mc2_stderr\": 0.014940162719394304\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009124,\n \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407154\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6419040031866162,\n \"acc_stderr\": 0.0047846072227746405,\n \"acc_norm\": 0.8345947022505477,\n \"acc_norm_stderr\": 0.0037078660457296035\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919426,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919426\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091098,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091098\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 
0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303954,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303954\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.5052388587667219,\n \"mc2_stderr\": 0.014940162719394304\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235807\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4313874147081122,\n \"acc_stderr\": 0.013642195352511571\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Einstein-v2-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-18-54.790433.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-18-54.790433.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-18-54.790433.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-18-54.790433.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-18-54.790433.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T00_18_54.790433", "path": ["**/details_harness|winogrande|5_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T00-18-54.790433.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T00_18_54.790433", "path": ["results_2024-02-04T00-18-54.790433.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T00-18-54.790433.parquet"]}]}]} | 2024-02-05T07:45:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Einstein-v2-7B
Dataset automatically created during the evaluation run of model Weyaxi/Einstein-v2-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
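A minimal sketch, using the repository id and the sample configuration given in this card's metadata:

```python
from datasets import load_dataset

# Load one evaluated task (here the 5-shot Winogrande split) from this run
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v2-7B",
	"harness_winogrande_5",
	split="train")
```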
## Latest results
These are the latest results from run 2024-02-04T00:18:54.790433 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Einstein-v2-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:18:54.790433(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Einstein-v2-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:18:54.790433(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
677352170009f9ac1bf1fca3a932a1834cde9b16 |
# Dataset Card for Evaluation run of s3nh/Severusectum-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [s3nh/Severusectum-7B-DPO](https://huggingface.co/s3nh/Severusectum-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO",
"harness_winogrande_5",
split="train")
```
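The aggregated scores live in the "results" configuration; a sketch of loading them, assuming it follows the same split layout as the per-task configurations (with "latest" pointing at the most recent run):

```python
from datasets import load_dataset

# Load the aggregated results instead of a single task
results = load_dataset("open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO",
	"results",
	split="latest")
```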
## Latest results
These are the [latest results from run 2024-02-04T00:26:53.768955](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO/blob/main/results_2024-02-04T00-26-53.768955.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535295097742561,
"acc_stderr": 0.03205169461375925,
"acc_norm": 0.6531125947259314,
"acc_norm_stderr": 0.03271927818828212,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7245391094382377,
"mc2_stderr": 0.01445327594903656
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.6998605855407289,
"acc_stderr": 0.00457381716300745,
"acc_norm": 0.8854809798844852,
"acc_norm_stderr": 0.0031778979482849352
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7245391094382377,
"mc2_stderr": 0.01445327594903656
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954769
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO | [
"region:us"
] | 2024-02-04T00:29:12+00:00 | {"pretty_name": "Evaluation run of s3nh/Severusectum-7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [s3nh/Severusectum-7B-DPO](https://huggingface.co/s3nh/Severusectum-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T00:26:53.768955](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO/blob/main/results_2024-02-04T00-26-53.768955.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535295097742561,\n \"acc_stderr\": 0.03205169461375925,\n \"acc_norm\": 0.6531125947259314,\n \"acc_norm_stderr\": 0.03271927818828212,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7245391094382377,\n \"mc2_stderr\": 0.01445327594903656\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002508,\n \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6998605855407289,\n \"acc_stderr\": 0.00457381716300745,\n \"acc_norm\": 0.8854809798844852,\n \"acc_norm_stderr\": 0.0031778979482849352\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.01377869377846408,\n 
\"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7245391094382377,\n \"mc2_stderr\": 0.01445327594903656\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \"acc_stderr\": 0.012560698010954769\n }\n}\n```", "repo_url": 
"https://huggingface.co/s3nh/Severusectum-7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T00_26_53.768955", "path": ["**/details_harness|winogrande|5_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T00-26-53.768955.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T00_26_53.768955", "path": ["results_2024-02-04T00-26-53.768955.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T00-26-53.768955.parquet"]}]}]} | 2024-02-04T00:29:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of s3nh/Severusectum-7B-DPO
Dataset automatically created during the evaluation run of model s3nh/Severusectum-7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
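A minimal sketch of that load call, reconstructed from this repository's metadata (it assumes the `datasets` library is installed; `harness_winogrande_5` is just one of the 63 task configs):

```python
from datasets import load_dataset

# Each evaluated task is its own config; the "train" split always points
# to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO",
    "harness_winogrande_5",
    split="train",
)
```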
## Latest results
These are the latest results from run 2024-02-04T00:26:53.768955 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
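The aggregated metrics can be loaded the same way; a hedged sketch (the "results" config name and the "latest" split are taken from this repository's metadata):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# resolves to the most recent one (2024-02-04T00:26:53.768955 here).
results = load_dataset(
    "open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO",
    "results",
    split="latest",
)
```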
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of s3nh/Severusectum-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model s3nh/Severusectum-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:26:53.768955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of s3nh/Severusectum-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model s3nh/Severusectum-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:26:53.768955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8e5eeba9b9a7cbddea3590df25efe626e17af945 |
# Dataset Card for Evaluation run of cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16](https://huggingface.co/cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
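# Per-task details live in configurations like "harness_winogrande_5";
# the "train" split always mirrors the latest timestamped run.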
data = load_dataset("open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16",
"harness_winogrande_5",
split="train")
```
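The same pattern works for any configuration listed in this card. As a small sketch (the `results` configuration, the `harness_gsm8k_5` configuration, and the `latest` split all appear in this card's metadata; the exact column layout of the returned tables is not shown here):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16"

# Aggregated metrics for the whole run; "latest" mirrors the most recent
# timestamped split.
results = load_dataset(REPO, "results", split="latest")

# Per-task details, e.g. the 5-shot GSM8K samples.
gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="latest")

print(results)
print(gsm8k)
```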
## Latest results
These are the [latest results from run 2024-02-04T00:31:33.025803](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16/blob/main/results_2024-02-04T00-31-33.025803.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" aggregate and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.764901672440236,
"acc_stderr": 0.02826230862515645,
"acc_norm": 0.7677453718421197,
"acc_norm_stderr": 0.02881226227160178,
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743176,
"mc2": 0.7224126373641326,
"mc2_stderr": 0.014009811551091062
},
"harness|arc:challenge|25": {
"acc": 0.7218430034129693,
"acc_stderr": 0.0130944699195388,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927094
},
"harness|hellaswag|10": {
"acc": 0.6701852220673172,
"acc_stderr": 0.004691848665399069,
"acc_norm": 0.8673571001792472,
"acc_norm_stderr": 0.003384951803213475
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.02785125297388977,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.02785125297388977
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7433862433862434,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.7433862433862434,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.016565754668270982,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.016565754668270982
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8848484848484849,
"acc_stderr": 0.024925699798115344,
"acc_norm": 0.8848484848484849,
"acc_norm_stderr": 0.024925699798115344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588796,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067323,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067323
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163085,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163085
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.01500631280644693,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.01500631280644693
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.913154533844189,
"acc_stderr": 0.01007029837774778,
"acc_norm": 0.913154533844189,
"acc_norm_stderr": 0.01007029837774778
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8,
"acc_stderr": 0.013378001241813072,
"acc_norm": 0.8,
"acc_norm_stderr": 0.013378001241813072
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.02009118893604371,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.02009118893604371
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764248,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764248
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957184,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957184
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.590612777053455,
"acc_stderr": 0.012558780895570757,
"acc_norm": 0.590612777053455,
"acc_norm_stderr": 0.012558780895570757
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.01575052628436335,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.01575052628436335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8530612244897959,
"acc_stderr": 0.02266540041721764,
"acc_norm": 0.8530612244897959,
"acc_norm_stderr": 0.02266540041721764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743176,
"mc2": 0.7224126373641326,
"mc2_stderr": 0.014009811551091062
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781098
},
"harness|gsm8k|5": {
"acc": 0.7445034116755117,
"acc_stderr": 0.012013462405460067
}
}
```
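Each task above reports a point estimate together with its standard error, so a rough uncertainty band can be read off directly. A minimal sketch (the normal-approximation 95% interval, mean ± 1.96 × stderr, is an assumption of this example, not something the harness reports; the numbers are copied from the JSON above):

```python
# Rough 95% confidence intervals from the reported accuracies and standard
# errors, using a normal approximation (mean +/- 1.96 * stderr).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7406143344709898, "acc_norm_stderr": 0.012808273573927094},
    "harness|hellaswag|10": {"acc_norm": 0.8673571001792472, "acc_norm_stderr": 0.003384951803213475},
    "harness|gsm8k|5": {"acc": 0.7445034116755117, "acc_stderr": 0.012013462405460067},
}

for task, metrics in results.items():
    # Prefer the normalized accuracy when present, otherwise plain accuracy.
    key = "acc_norm" if "acc_norm" in metrics else "acc"
    mean, stderr = metrics[key], metrics[f"{key}_stderr"]
    low, high = mean - 1.96 * stderr, mean + 1.96 * stderr
    print(f"{task}: {key} = {mean:.4f} (95% CI {low:.4f}-{high:.4f})")
```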
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16 | [
"region:us"
] | 2024-02-04T00:33:48+00:00 | {"pretty_name": "Evaluation run of cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16](https://huggingface.co/cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T00:31:33.025803](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16/blob/main/results_2024-02-04T00-31-33.025803.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.764901672440236,\n \"acc_stderr\": 0.02826230862515645,\n \"acc_norm\": 0.7677453718421197,\n \"acc_norm_stderr\": 0.02881226227160178,\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743176,\n \"mc2\": 0.7224126373641326,\n \"mc2_stderr\": 0.014009811551091062\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7218430034129693,\n \"acc_stderr\": 0.0130944699195388,\n \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927094\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n \"acc_stderr\": 0.004691848665399069,\n \"acc_norm\": 0.8673571001792472,\n \"acc_norm_stderr\": 0.003384951803213475\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 
0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.02785125297388977,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.02785125297388977\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7433862433862434,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.7433862433862434,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9064516129032258,\n \"acc_stderr\": 0.016565754668270982,\n \"acc_norm\": 0.9064516129032258,\n \"acc_norm_stderr\": 0.016565754668270982\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8848484848484849,\n \"acc_stderr\": 0.024925699798115344,\n \"acc_norm\": 0.8848484848484849,\n \"acc_norm_stderr\": 0.024925699798115344\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n \"acc_norm\": 
0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588796,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067323,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067323\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163085,\n \"acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163085\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6712962962962963,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.01500631280644693,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.01500631280644693\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 
0.0348735088019777\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.913154533844189,\n \"acc_stderr\": 0.01007029837774778,\n \"acc_norm\": 0.913154533844189,\n \"acc_norm_stderr\": 0.01007029837774778\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.013378001241813072,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.013378001241813072\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.02009118893604371,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.02009118893604371\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n \"acc_stderr\": 0.022961339906764248,\n \"acc_norm\": 0.7942122186495176,\n \"acc_norm_stderr\": 0.022961339906764248\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957184,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957184\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.590612777053455,\n \"acc_stderr\": 0.012558780895570757,\n \"acc_norm\": 0.590612777053455,\n \"acc_norm_stderr\": 0.012558780895570757\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436335,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8530612244897959,\n \"acc_stderr\": 0.02266540041721764,\n \"acc_norm\": 0.8530612244897959,\n \"acc_norm_stderr\": 0.02266540041721764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743176,\n \"mc2\": 0.7224126373641326,\n \"mc2_stderr\": 0.014009811551091062\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781098\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7445034116755117,\n \"acc_stderr\": 0.012013462405460067\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-31-33.025803.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-31-33.025803.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-31-33.025803.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T00-31-33.025803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-31-33.025803.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["**/details_harness|winogrande|5_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T00-31-33.025803.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T00_31_33.025803", "path": ["results_2024-02-04T00-31-33.025803.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T00-31-33.025803.parquet"]}]}]} | 2024-02-04T00:34:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16
Dataset automatically created during the evaluation run of model cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
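A minimal sketch, assuming this card's details live in the usual leaderboard repo (the `details_{org}__{model}` repo id below is inferred from the leaderboard's naming convention, not stated in this record):
```python
from datasets import load_dataset

# Repo id is assumed from the leaderboard's details_{org}__{model} convention
data = load_dataset("open-llm-leaderboard/details_cloudyu__TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16",
	"harness_winogrande_5",
	split="train")
```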
## Latest results
These are the latest results from run 2024-02-04T00:31:33.025803 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:31:33.025803(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T00:31:33.025803(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
95b53f68a76b4efa34722d32a0eab1d5da203776 | # Dataset Card for "processed_distilabel-math-preference-dpo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_distilabel-math-preference-dpo | [
"region:us"
] | 2024-02-04T00:59:20+00:00 | {"dataset_info": {"features": [{"name": "metadata", "dtype": "string", "id": "metadata"}, {"name": "instruction", "dtype": "string"}, {"name": "chosen_response", "dtype": "string"}, {"name": "chosen_rating", "dtype": "float64"}, {"name": "rejected_response", "dtype": "string"}, {"name": "rejected_rating", "dtype": "float64"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 13995619, "num_examples": 2418}], "download_size": 5674023, "dataset_size": 13995619}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T00:59:27+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_distilabel-math-preference-dpo"
More Information needed | [
"# Dataset Card for \"processed_distilabel-math-preference-dpo\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_distilabel-math-preference-dpo\"\n\nMore Information needed"
] |
cd7da2e620f3f214c57f15936eae859ca620489a |
# Dataset Card for Evaluation run of l3utterfly/tinyllama-1.1b-layla-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [l3utterfly/tinyllama-1.1b-layla-v1](https://huggingface.co/l3utterfly/tinyllama-1.1b-layla-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1",
"harness_winogrande_5",
split="train")
```
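The aggregated metrics mentioned above are exposed through the "results" configuration. A minimal sketch of loading them, assuming the same repo id and the "latest" split alias described in this card:
```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; "latest" is assumed to alias
# the newest timestamped split, as described above
results = load_dataset("open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1",
	"results",
	split="latest")
```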
## Latest results
These are the [latest results from run 2024-02-04T01:13:05.279665](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1/blob/main/results_2024-02-04T01-13-05.279665.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25360571978457686,
"acc_stderr": 0.030631027319513834,
"acc_norm": 0.2545795497909492,
"acc_norm_stderr": 0.03138700259514901,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082685,
"mc2": 0.4102690094325491,
"mc2_stderr": 0.014297501504936961
},
"harness|arc:challenge|25": {
"acc": 0.3225255972696246,
"acc_stderr": 0.01365998089427737,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.01388064457015621
},
"harness|hellaswag|10": {
"acc": 0.4600677155945031,
"acc_stderr": 0.004973842670559798,
"acc_norm": 0.5985859390559649,
"acc_norm_stderr": 0.004891826692722825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310675,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310675
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566019,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566019
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.029605623981771214,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.029605623981771214
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895514,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.02850137816789395,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.02850137816789395
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.030031147977641545,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.030031147977641545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246794,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.01760430414925649,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.01760430414925649
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.13592233009708737,
"acc_stderr": 0.03393295729761015,
"acc_norm": 0.13592233009708737,
"acc_norm_stderr": 0.03393295729761015
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807092,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807092
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.02249723019096755,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.02249723019096755
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642983,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642983
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2607561929595828,
"acc_stderr": 0.01121347155960233,
"acc_norm": 0.2607561929595828,
"acc_norm_stderr": 0.01121347155960233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366828,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366828
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1469387755102041,
"acc_stderr": 0.022665400417217638,
"acc_norm": 0.1469387755102041,
"acc_norm_stderr": 0.022665400417217638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.036293353299478595,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.036293353299478595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082685,
"mc2": 0.4102690094325491,
"mc2_stderr": 0.014297501504936961
},
"harness|winogrande|5": {
"acc": 0.5974743488555643,
"acc_stderr": 0.013782866831703044
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909443
}
}
```
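For reference, a small sketch of pulling per-task accuracies out of a payload shaped like the JSON above (the on-disk `results_*.json` may wrap this object under an extra key, so treat the file layout as an assumption):
```python
import json

# Assumes the file contains exactly the object printed above
with open("results_2024-02-04T01-13-05.279665.json") as f:
    payload = json.load(f)

# Keep only the MMLU (hendrycksTest) tasks; "all", ARC, HellaSwag, etc. are filtered out
mmlu_accs = {
    task: metrics["acc"]
    for task, metrics in payload.items()
    if task.startswith("harness|hendrycksTest")
}
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs.values()) / len(mmlu_accs):.4f}")
```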
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1 | [
"region:us"
] | 2024-02-04T01:14:58+00:00 | {"pretty_name": "Evaluation run of l3utterfly/tinyllama-1.1b-layla-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [l3utterfly/tinyllama-1.1b-layla-v1](https://huggingface.co/l3utterfly/tinyllama-1.1b-layla-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T01:13:05.279665](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1/blob/main/results_2024-02-04T01-13-05.279665.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25360571978457686,\n \"acc_stderr\": 0.030631027319513834,\n \"acc_norm\": 0.2545795497909492,\n \"acc_norm_stderr\": 0.03138700259514901,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082685,\n \"mc2\": 0.4102690094325491,\n \"mc2_stderr\": 0.014297501504936961\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3225255972696246,\n \"acc_stderr\": 0.01365998089427737,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.01388064457015621\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4600677155945031,\n \"acc_stderr\": 0.004973842670559798,\n \"acc_norm\": 0.5985859390559649,\n \"acc_norm_stderr\": 0.004891826692722825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566019,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566019\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.029605623981771214,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.029605623981771214\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.024472243840895514,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.024472243840895514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.02850137816789395,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.02850137816789395\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n \"acc_norm\": 
0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246794,\n \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246794\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882392,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882392\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21467889908256882,\n \"acc_stderr\": 0.01760430414925649,\n \"acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.01760430414925649\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.13592233009708737,\n \"acc_stderr\": 0.03393295729761015,\n \"acc_norm\": 0.13592233009708737,\n \"acc_norm_stderr\": 0.03393295729761015\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n \"acc_stderr\": 0.015696008563807092,\n \"acc_norm\": 0.26053639846743293,\n \"acc_norm_stderr\": 0.015696008563807092\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.02249723019096755,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.02249723019096755\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642983,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642983\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2607561929595828,\n \"acc_stderr\": 0.01121347155960233,\n \"acc_norm\": 0.2607561929595828,\n \"acc_norm_stderr\": 0.01121347155960233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559345,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366828,\n \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366828\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1469387755102041,\n \"acc_stderr\": 0.022665400417217638,\n \"acc_norm\": 0.1469387755102041,\n \"acc_norm_stderr\": 0.022665400417217638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.036293353299478595,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.036293353299478595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082685,\n \"mc2\": 0.4102690094325491,\n \"mc2_stderr\": 0.014297501504936961\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 
0.013782866831703044\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.0030152942428909443\n }\n}\n```", "repo_url": "https://huggingface.co/l3utterfly/tinyllama-1.1b-layla-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|arc:challenge|25_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|gsm8k|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hellaswag|10_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T01-13-05.279665.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T01-13-05.279665.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T01-13-05.279665.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T01-13-05.279665.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T01-13-05.279665.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["**/details_harness|winogrande|5_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T01-13-05.279665.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T01_13_05.279665", "path": ["results_2024-02-04T01-13-05.279665.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T01-13-05.279665.parquet"]}]}]} | 2024-02-04T01:15:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of l3utterfly/tinyllama-1.1b-layla-v1
Dataset automatically created during the evaluation run of model l3utterfly/tinyllama-1.1b-layla-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
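A minimal sketch (the repository id is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming; the config name is one of the 63 listed in this card's metadata):

```python
from datasets import load_dataset

# The repository id below is an assumption, based on the Open LLM
# Leaderboard's standard "details_<org>__<model>" naming scheme.
data = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__tinyllama-1.1b-layla-v1",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="latest",          # "latest" always points to the newest run
)
```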
## Latest results
These are the latest results from run 2024-02-04T01:13:05.279665 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of l3utterfly/tinyllama-1.1b-layla-v1\n\n\n\nDataset automatically created during the evaluation run of model l3utterfly/tinyllama-1.1b-layla-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T01:13:05.279665(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of l3utterfly/tinyllama-1.1b-layla-v1\n\n\n\nDataset automatically created during the evaluation run of model l3utterfly/tinyllama-1.1b-layla-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T01:13:05.279665(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
64a2b4e6cc48588324dc6f9282791a5a2f19215e |
# Mathworld
 - Wolfram MathWorld scraped, but without images
- Should be every link | VatsaDev/mathworld | [
"license:mit",
"region:us"
] | 2024-02-04T01:18:49+00:00 | {"license": "mit"} | 2024-02-04T01:20:49+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# Mathworld
 - Wolfram MathWorld scraped, but without images
- Should be every link | [
"# Mathworld\n\n - Wolfram Mathworld scarped, but without images\n - Should be every link"
] | [
"TAGS\n#license-mit #region-us \n",
"# Mathworld\n\n - Wolfram Mathworld scarped, but without images\n - Should be every link"
] |
9e9f0f2a197c64bf55c1d0c332c97fd337dbde5e |
[Locutusque/hercules-v2.0](https://huggingface.co/datasets/Locutusque/hercules-v2.0) in ChatML format.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
from transformers import AutoTokenizer

# Tokenizer whose ChatML chat template is used to render each conversation.
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_or_path="Felladrin/Llama-160M-Chat-v1"
)

dataset = load_dataset("Locutusque/hercules-v2.0", split="train")


def format(columns):
    """Convert one row's "conversations" list into a single ChatML string."""
    messages = []
    for message in columns["conversations"]:
        content = message["value"]
        role = message["from"]
        # Map the source dataset's role names onto ChatML roles.
        if role == "human":
            role = "user"
        elif role == "gpt":
            role = "assistant"
        if role and content:
            messages.append(
                {
                    "role": role.strip(),
                    "content": content.strip(),
                }
            )
    return tokenizer.apply_chat_template(messages, tokenize=False)


pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_parquet(
    "train.parquet", index=False
)
```
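
As a quick sanity check, here is a minimal sketch (assuming the script above has been run in the current directory) for loading the converted file back with `datasets`:

```python
from datasets import load_dataset

# "train.parquet" is the file written by the conversion script above.
ds = load_dataset("parquet", data_files="train.parquet", split="train")
print(ds[0]["text"])  # one ChatML-formatted conversation per row
```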
| Felladrin/ChatML-hercules-v2.0 | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-04T01:29:34+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["question-answering", "text-generation"]} | 2024-02-04T01:33:16+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us
|
Locutusque/hercules-v2.0 in ChatML format.
Python code used for conversion:
| [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n"
] |
a21e9499b43b46e802782bb223cfb22cb60d82cd | ---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset aims to be a tool to help trace linguistic patterns in reddit posts from members who partake in the internet-centric pill ideologies known as the black pill, red pill, and blue pill.
## Dataset Details
### Dataset Description
A few of the major groups' posts have been coalesced into one dataset, all from different years. There are more than 200 posts for each of the major pill groups on reddit (red pill rebooted, blue pill, black pill, married red pill, red pill women, and feminism as a counterpoint of reference). The group of feminism was added as a juxtaposition against red pill women, in order to allow researchers to explore those dichotomies. For researchers, the value will be in identifying or classifying the types of words that make one ideology more prominent than the other.
- **Curated by:** [steamcyclone]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [steamcyclone]
- **Language(s) (NLP):** [EN]
- **License:** [CC]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [This is the only source]
## Uses
The main usage of this dataset is to study linguistic patterns. Running models to detect word usage per group, as well as overlaps across groups, is an ideal use for this dataset. With the rise of the loneliness epidemic, any insights that come from this are welcome.
### Direct Use
The suitable use cases are multi-class classification, word clustering or semantic clustering per group, summarization modeling, text parsing, and any other natural language processing task; a minimal classification sketch follows below.
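
For instance, a minimal sketch of a bag-of-words subreddit classifier (not an official baseline; the `subreddit` and `text` column names are taken from the structure listed below, and scikit-learn is an arbitrary choice):

```python
from datasets import load_dataset
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

ds = load_dataset("steamcyclone/Pill-Ideologies-New-Test", split="train")
texts = [row["text"] or "" for row in ds]   # post body; may be empty for link posts
labels = [row["subreddit"] for row in ds]   # the ideology label is the post's origin

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.2, random_state=0, stratify=labels
)
vectorizer = TfidfVectorizer(max_features=20_000, stop_words="english")
clf = LogisticRegression(max_iter=1000)
clf.fit(vectorizer.fit_transform(X_train), y_train)
print(clf.score(vectorizer.transform(X_test), y_test))  # multi-class accuracy
```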
[More Information Needed]
### Out-of-Scope Use
This dataset is not meant to be utilized to demonize or mock certain online communities for the trials in life in which individuals find themselves. If the viewer's aim is to push forward some misandrist or misogynistic agenda, please ignore this dataset.
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Currently, this dataset contains the following fields (a minimal inspection sketch follows the list):
- subreddit of the post : string,
- postid : string
- title of the post: string
- text of the post (where applicable) : string
- url (if something was embedded) : string
- score : int32
- author : string
- date : int64
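
A minimal sketch for loading the dataset and checking the class balance discussed under Bias below (field names as listed above; the split name is an assumption):

```python
from collections import Counter
from datasets import load_dataset

ds = load_dataset("steamcyclone/Pill-Ideologies-New-Test", split="train")
print(ds.features)               # field names and dtypes as listed above
print(Counter(ds["subreddit"]))  # posts per group; useful for rebalancing
```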
[More Information Needed]
## Dataset Creation
### Curation Rationale
With the rise of the loneliness epidemic and the radicalization of internet content pitting men and women against each other, it is important to seek understanding of the root of the problem. Depending on whom you ask, you'll get a plethora of answers. Jordan Peterson describes it as some type of post-modernist feminist liberalism problem. The Andrew Tates and other conservative archetypes blame the loss of traditionalism. Others blame dating apps and their selection-bias effects. Within each of the major pill ideologies, with the exception of the BlackPill, men blame women, and women blame men.
Unfortunately, male spaces, as substantiated by research and media coverage, in recent years have only been able to exist on the internet, and counter-spaces have emerged to challenge the views held in the differing ideologies.
In short, according to archetypical definitions
- the red pill is the emancipation of masculinity in a feminized age and understanding mating strategies with women.
- the blue pill is the satire of the red pill, often run by women.
- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.
- the pink pill is about improving the female image by augmenting sexual marketplace value.
[More Information Needed]
### Source Data
Each record contains a reddit post (approximately 200 per group) with a title and body text that convey the author's intended message.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
In progress.
However, the plan is to increase the number of records and leverage the ChatGPT API to summarize the messages into categories. In addition, the dates have to be cleaned a little in order to add value for researchers. I am also not sure if I can retrieve comments per post, further augmenting the data.
[More Information Needed]
#### Who are the source data producers?
The producers of the data are the various redditors who have participated in these spaces.
[More Information Needed]
### Annotations [optional]
A planned annotation, not yet part of the collection, is a set of ChatGPT summarizations. The subreddit labels are merely the origins of the posts.
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
The origin of the posts are the labels of the records.
#### Who are the annotators?
I and the subreddit origin are the label annotators.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
This dataset contains no personally identifiable information with the exception of embedded youtube links. Those links may lead to videos where the impact of the content is unknown.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
A major caveat is that the pink pill and original red pill groups are shadow banned, impeding their scraping process. This is a flaw I recognize because the original red pill movement, which started in books by authors, propagated itself through its internet (reddit) variant, and it spawned all the other pills.
Another bias point is that there is more red pill content, as a means to compensate for the ban of the original red pill subreddit.
As such, I caution researchers to balance their datasets where necessary.
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. Remember that this dataset is not a tool for reckless and hateful political agendas.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
Pill ideologies :
In short, according to archetypical definitions
- the red pill is the emancipation of masculinity in a feminized age and understanding mating strategies with women.
- the blue pill is the satire of the red pill, often run by women.
- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.
- the pink pill is about improving the female image by augmenting sexual marketplace value.
## Dataset Card Authors [optional]
steamcyclone, and all the redditors from the subreddits in the author column.
## Dataset Card Contact
- N/A | steamcyclone/Pill-Ideologies-New-Test | [
"task_categories:text-classification",
"task_ids:multi-class-classification",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"size_categories:n<10K",
"source_datasets:reddit",
"language:en",
"license:cc",
"natural-language-understanding",
"ideology classification",
"text classification",
"region:us"
] | 2024-02-04T01:33:59+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["en"], "license": "cc", "size_categories": ["n<10K"], "source_datasets": ["reddit"], "task_categories": ["text-classification"], "task_ids": ["multi-class-classification"], "pretty_name": "PiLls", "tags": ["natural-language-understanding", "ideology classification", "text classification"]} | 2024-02-08T00:56:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #task_ids-multi-class-classification #annotations_creators-crowdsourced #language_creators-crowdsourced #size_categories-n<10K #source_datasets-reddit #language-English #license-cc #natural-language-understanding #ideology classification #text classification #region-us
| ---
# Dataset Card for Dataset Name
This dataset aims to be a tool to help trace linguistic patterns in reddit posts from members who partake in the internet-centric pill ideologies known as the black pill, red pill, and blue pill.
## Dataset Details
### Dataset Description
A few of the major groups' posts have been coalesced into one dataset, all from different years. There are more than 200 posts for each of the major pill groups on reddit (red pill rebooted, blue pill, black pill, married red pill, red pill women, and feminism as a counterpoint of reference). The group of feminism was added as a juxtaposition against red pill women, in order to allow researchers to explore those dichotomies. For researchers, the value will be in identifying or classifying the types of words that make one ideology more prominent than the other.
- Curated by: [steamcyclone]
- Funded by [optional]:
- Shared by [optional]: [steamcyclone]
- Language(s) (NLP): [EN]
- License: [CC]
### Dataset Sources [optional]
- Repository: [This is the only source]
## Uses
The main usage of this dataset is to study linguistic patterns. Running models to detect word usage per group, as well as overlaps across groups, is an ideal use for this dataset. With the rise of the loneliness epidemic, any insights that come from this are welcome.
### Direct Use
The suitable use cases are multi-class classification, word clustering or semantic clustering per group, summarization modeling, text parsing, and any other natural language processing task.
### Out-of-Scope Use
This dataset is not meant to be utilized to demonize or mock certain online communities for the trials in life in which individuals find themselves. If the viewer's aim is to push forward some misandrist or misogynistic agenda, please ignore this dataset.
## Dataset Structure
Currently, this dataset contains
- subreddit of the post : string,
- postid : string
- title of the post: string
- text of the post (where applicable) : string
- url (if something was embedded) : string
- score : int32
- author : string
- date : int64
## Dataset Creation
### Curation Rationale
With the rise of the loneliness epidemic and the radicalization of internet content pitting men and women against each other, it is important to seek understanding of the root of the problem. Depending on whom you ask, you'll get a plethora of answers. Jordan Peterson describes it as some type of post-modernist feminist liberalism problem. The Andrew Tates and other conservative archetypes blame the loss of traditionalism. Others blame dating apps and their selection-bias effects. Within each of the major pill ideologies, with the exception of the BlackPill, men blame women, and women blame men.
Unfortunately, male spaces, as substantiated by research and media coverage, in recent years have only been able to exist on the internet, and counter-spaces have emerged to challenge the views held in the differing ideologies.
In short, according to archetypical definitions
- the red pill is the emancipation of masculinity in a feminized age and understanding mating strategies with women.
- the blue pill is the satire of the red pill, often run by women.
- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.
- the pink pill is about improving the female image by augmenting sexual marketplace value.
### Source Data
Each record contains a reddit post (approximately 200 per group) with a title and body text that convey the author's intended message.
#### Data Collection and Processing
In progress.
However, the plan is to increase the number of records and leverage the ChatGPT API to summarize the messages into categories. In addition, the dates have to be cleaned a little in order to add value for researchers. I am also not sure if I can retrieve comments per post, further augmenting the data.
#### Who are the source data producers?
The producers of the data are the various redditors who have participated in these spaces.
### Annotations [optional]
A planned annotation, not yet part of the collection, is a set of ChatGPT summarizations. The subreddit labels are merely the origins of the posts.
#### Annotation process
The origin of the posts are the labels of the records.
#### Who are the annotators?
I and the subreddit origin are the label annotators.
#### Personal and Sensitive Information
This dataset contains no personally identifiable information with the exception of embedded youtube links. Those links may lead to videos where the impact of the content is unknown.
## Bias, Risks, and Limitations
A major caveat is that the pink pill and original red pill groups are shadow banned, impeding their scraping process. This is a flaw I recognize because the original red pill movement, which started in books by authors, propagated itself through its internet (reddit) variant, and it spawned all the other pills.
Another bias point is that there is more red pill content, as a means to compensate for the ban of the original red pill subreddit.
As such, I caution researchers to balance their datasets where necessary.
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. Remember that this dataset is not a tool for reckless and hateful political agendas.
[optional]
BibTeX:
APA:
## Glossary [optional]
Pill ideologies :
In short, according to archetypical definitions
- the red pill is the emancipation of the masculinity in a feminized age and understanding mating strategies with women.
- the blue pill is the satire of the red pill, often run by women.
- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.
- the pink pill is about improving the female image by augmenting sexual marketplace value.
## Dataset Card Authors [optional]
steamcyclone, and all the redditors from the subreddits in the author column.
## Dataset Card Contact
- N/A | [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset aims to be a tool to help trace linguistic patterns in the reddit posts from members who partake in the internet centric pill ideologies, known as blackpill, red pill, blue pill.",
"## Dataset Details",
"### Dataset Description\n\nA few of the major groups' posts have been coalesced into one dataset, all from different years. There are more than 200 posts per the major pill groups on reddit (red pill rebooted, blue pill, black pill, married red pill, red pill women, and feminism as a counterpoint of reference). The group of feminism was added as a juxtaposition against red pill women, in oder to allow researchers to explore those dichotomies. For researchers, the value will be in identifying or classifying the types of words that make one ideology more prominent than the other.\n\n- Curated by: [steamcyclone]\n- Funded by [optional]: \n- Shared by [optional]: [steamcyclone]\n- Language(s) (NLP): [EN]\n- License: [CC]",
"### Dataset Sources [optional]\n\n\n\n- Repository: [This is the only source]",
"## Uses\n\nThe main usage of this dataset is to study linguistic patterns. Running models and detecting word usage per groups, as well as overlaps across groups is an ideal use for this dataset. With the rise of the loneliness epidemic, any insights that come from this are welcome.",
"### Direct Use\n\nThe suitable use cases are to multi-class classification, word clustering or semantic clustering per different groups, summarization modeling, text parsing, and any other natural language processing task.",
"### Out-of-Scope Use\n\nThis dataset is not meant to be utilized to demonize or mock certain online communities for the trials in life in which individuals find themselves. If the viewer's agenda is to push forward some misandrist or misogynistic agenda, please ignore this dataset.",
"## Dataset Structure\n\n\n\nCurrently, this dataset contains \n\n- subreddit of the post : string,\n- postid : string\n- title of the post: string\n- text of the post (where applicable) : string\n- url (if something was embedded) : string\\\n- score : int32\n- author : string\n- date : int64",
"## Dataset Creation",
"### Curation Rationale\n\nWith the rise of the loneliness epidemic and the radicalization of internet content pitting men and women against each other, it is important to seek understanding of the root of the problem. Depending on whom you ask, you'll get a plethora of answers. Jordan Peterson describes it as some type of post-modernist feminist liberalism problem. The Andrew Tates and other conservative archetypes blame the loss of traditionalism. Others blame dating apps and its selection bias effects. Within each of the major pill ideologies, with the exception of the BlackPill, men blame women, and women blame men. \n\nUnfortunately, male spaces, as substantiated by research and media coverage, in recent years have only been able to exist on the internet, and counter-spaces have emerged to challenge the views held in the differing ideologies.\n\nIn short, according to archetypical definitions\n- the red pill is the emancipation of the masculinity in a feminized age and understanding mating strategies with women. \n- the blue pill is the satire of the red pill, often run by women.\n- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.\n- the pink pill is about improving the female image by augmenting sexual marketplace value.",
"### Source Data\n\nEach record contains a reddit post, approximately 200 per group, and has a key title and a post with words to display the intended message by the author.",
"#### Data Collection and Processing\n\n\n\nIn progress. \n\nHowever, the plan is to increase the amount of records and leverage the ChatGpt API to summarize the messages into categories. In addition, the dates have to be cleaned a little, in order to add use for researches. I am also not sure if I can retrieve comments per post, further augmenting the data.",
"#### Who are the source data producers?\n\nThe producers of the data are the various redditors who have participated in these spaces.",
"### Annotations [optional]\n\nAn annotation that is not part of the collection will be the ChatGPT summarizations (future). The subreddit labels are merely the origins of the posts.",
"#### Annotation process\n\n\n\nThe origin of the posts are the labels of the records.",
"#### Who are the annotators?\n\nI and the subreddit origin are the label annotators.",
"#### Personal and Sensitive Information\n\n\n\nThis dataset contains no personally identifiable information with the exception of embedded youtube links. Those links may lead to videos where the impact of the content is unknown.",
"## Bias, Risks, and Limitations\n\n\n\nA major caveat is that the pink pill and original red pill groups are shadow banned, impeding their scraping process. This is a flaw I recognize because the original red pill movement, which started in books by authors, propagated itself through its internet (reddit) variant, and it spawned all the other pills.\n\nAnother bias point is that there is more red pill content, as a means to compensate for the ban of the original red pill subreddit. \n\nAs such, I caution researchers to balance their datasets where necessary.",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. Remember that this dataset is not a tool for reckless and hateful political agendas.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]\n\nPill ideologies :\n\nIn short, according to archetypical definitions\n- the red pill is the emancipation of the masculinity in a feminized age and understanding mating strategies with women. \n- the blue pill is the satire of the red pill, often run by women.\n- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.\n- the pink pill is about improving the female image by augmenting sexual marketplace value.",
"## Dataset Card Authors [optional]\n\nsteamcyclone, all the redditors from the subreddits in the authors columns.",
"## Dataset Card Contact\n\n- N/A"
] | [
"TAGS\n#task_categories-text-classification #task_ids-multi-class-classification #annotations_creators-crowdsourced #language_creators-crowdsourced #size_categories-n<10K #source_datasets-reddit #language-English #license-cc #natural-language-understanding #ideology classification #text classification #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset aims to be a tool to help trace linguistic patterns in the reddit posts from members who partake in the internet centric pill ideologies, known as blackpill, red pill, blue pill.",
"## Dataset Details",
"### Dataset Description\n\nA few of the major groups' posts have been coalesced into one dataset, all from different years. There are more than 200 posts per the major pill groups on reddit (red pill rebooted, blue pill, black pill, married red pill, red pill women, and feminism as a counterpoint of reference). The group of feminism was added as a juxtaposition against red pill women, in oder to allow researchers to explore those dichotomies. For researchers, the value will be in identifying or classifying the types of words that make one ideology more prominent than the other.\n\n- Curated by: [steamcyclone]\n- Funded by [optional]: \n- Shared by [optional]: [steamcyclone]\n- Language(s) (NLP): [EN]\n- License: [CC]",
"### Dataset Sources [optional]\n\n\n\n- Repository: [This is the only source]",
"## Uses\n\nThe main usage of this dataset is to study linguistic patterns. Running models and detecting word usage per groups, as well as overlaps across groups is an ideal use for this dataset. With the rise of the loneliness epidemic, any insights that come from this are welcome.",
"### Direct Use\n\nThe suitable use cases are to multi-class classification, word clustering or semantic clustering per different groups, summarization modeling, text parsing, and any other natural language processing task.",
"### Out-of-Scope Use\n\nThis dataset is not meant to be utilized to demonize or mock certain online communities for the trials in life in which individuals find themselves. If the viewer's agenda is to push forward some misandrist or misogynistic agenda, please ignore this dataset.",
"## Dataset Structure\n\n\n\nCurrently, this dataset contains \n\n- subreddit of the post : string,\n- postid : string\n- title of the post: string\n- text of the post (where applicable) : string\n- url (if something was embedded) : string\\\n- score : int32\n- author : string\n- date : int64",
"## Dataset Creation",
"### Curation Rationale\n\nWith the rise of the loneliness epidemic and the radicalization of internet content pitting men and women against each other, it is important to seek understanding of the root of the problem. Depending on whom you ask, you'll get a plethora of answers. Jordan Peterson describes it as some type of post-modernist feminist liberalism problem. The Andrew Tates and other conservative archetypes blame the loss of traditionalism. Others blame dating apps and its selection bias effects. Within each of the major pill ideologies, with the exception of the BlackPill, men blame women, and women blame men. \n\nUnfortunately, male spaces, as substantiated by research and media coverage, in recent years have only been able to exist on the internet, and counter-spaces have emerged to challenge the views held in the differing ideologies.\n\nIn short, according to archetypical definitions\n- the red pill is the emancipation of the masculinity in a feminized age and understanding mating strategies with women. \n- the blue pill is the satire of the red pill, often run by women.\n- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.\n- the pink pill is about improving the female image by augmenting sexual marketplace value.",
"### Source Data\n\nEach record contains a reddit post, approximately 200 per group, and has a key title and a post with words to display the intended message by the author.",
"#### Data Collection and Processing\n\n\n\nIn progress. \n\nHowever, the plan is to increase the amount of records and leverage the ChatGpt API to summarize the messages into categories. In addition, the dates have to be cleaned a little, in order to add use for researches. I am also not sure if I can retrieve comments per post, further augmenting the data.",
"#### Who are the source data producers?\n\nThe producers of the data are the various redditors who have participated in these spaces.",
"### Annotations [optional]\n\nAn annotation that is not part of the collection will be the ChatGPT summarizations (future). The subreddit labels are merely the origins of the posts.",
"#### Annotation process\n\n\n\nThe origin of the posts are the labels of the records.",
"#### Who are the annotators?\n\nI and the subreddit origin are the label annotators.",
"#### Personal and Sensitive Information\n\n\n\nThis dataset contains no personally identifiable information with the exception of embedded youtube links. Those links may lead to videos where the impact of the content is unknown.",
"## Bias, Risks, and Limitations\n\n\n\nA major caveat is that the pink pill and original red pill groups are shadow banned, impeding their scraping process. This is a flaw I recognize because the original red pill movement, which started in books by authors, propagated itself through its internet (reddit) variant, and it spawned all the other pills.\n\nAnother bias point is that there is more red pill content, as a means to compensate for the ban of the original red pill subreddit. \n\nAs such, I caution researchers to balance their datasets where necessary.",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. Remember that this dataset is not a tool for reckless and hateful political agendas.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]\n\nPill ideologies :\n\nIn short, according to archetypical definitions\n- the red pill is the emancipation of the masculinity in a feminized age and understanding mating strategies with women. \n- the blue pill is the satire of the red pill, often run by women.\n- the black pill is meant to bridge the gaps across the red, pink, and blue pills in order to land on a ground truth.\n- the pink pill is about improving the female image by augmenting sexual marketplace value.",
"## Dataset Card Authors [optional]\n\nsteamcyclone, all the redditors from the subreddits in the authors columns.",
"## Dataset Card Contact\n\n- N/A"
] |
590b3da437b5f14199673b8385e58811c8452116 | # Dataset Card for "processed_distilabel-capybara-dpo-7k-binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_distilabel-capybara-dpo-7k-binarized | [
"region:us"
] | 2024-02-04T01:37:24+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "conversation", "list": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}]}, {"name": "original_response", "dtype": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "new_generations", "sequence": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rating_chosen", "dtype": "int64"}, {"name": "rating_rejected", "dtype": "int64"}, {"name": "chosen_model", "dtype": "string"}, {"name": "rejected_model", "dtype": "string"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 427312337, "num_examples": 7563}], "download_size": 0, "dataset_size": 427312337}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:11:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_distilabel-capybara-dpo-7k-binarized"
More Information needed | [
"# Dataset Card for \"processed_distilabel-capybara-dpo-7k-binarized\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_distilabel-capybara-dpo-7k-binarized\"\n\nMore Information needed"
] |
66807891d7e4af9c305942a4f08f29180dfa7316 | # Dataset Card for "processed_distilabel-intel-orca-dpo-pairs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_distilabel-intel-orca-dpo-pairs | [
"region:us"
] | 2024-02-04T02:16:43+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "order", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "dtype": "string"}, {"name": "status", "dtype": "string"}, {"name": "original_chosen", "dtype": "string"}, {"name": "original_rejected", "dtype": "string"}, {"name": "chosen_score", "dtype": "float64"}, {"name": "in_gsm8k_train", "dtype": "bool"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 209153276, "num_examples": 12859}], "download_size": 103496030, "dataset_size": 209153276}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:17:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_distilabel-intel-orca-dpo-pairs"
More Information needed | [
"# Dataset Card for \"processed_distilabel-intel-orca-dpo-pairs\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_distilabel-intel-orca-dpo-pairs\"\n\nMore Information needed"
] |
e87d6b1155dc070aedad58a13f700014059e31fd | # Dataset Card for "processed_distilabel-intel-orca-dpo-pairs-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_distilabel-intel-orca-dpo-pairs-v2 | [
"region:us"
] | 2024-02-04T02:22:55+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "order", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "dtype": "string"}, {"name": "status", "dtype": "string"}, {"name": "original_chosen", "dtype": "string"}, {"name": "original_rejected", "dtype": "string"}, {"name": "chosen_score", "dtype": "float64"}, {"name": "in_gsm8k_train", "dtype": "bool"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 209153276, "num_examples": 12859}], "download_size": 103496030, "dataset_size": 209153276}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:24:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_distilabel-intel-orca-dpo-pairs-v2"
More Information needed | [
"# Dataset Card for \"processed_distilabel-intel-orca-dpo-pairs-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_distilabel-intel-orca-dpo-pairs-v2\"\n\nMore Information needed"
] |
06ba71853d7f601d715ae647904acfdacdd47048 | # Dataset Card for "processed_truthy-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_truthy-v2 | [
"region:us"
] | 2024-02-04T02:32:45+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "system", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 3097676, "num_examples": 1016}], "download_size": 1360242, "dataset_size": 3097676}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:32:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_truthy-v2"
More Information needed | [
"# Dataset Card for \"processed_truthy-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_truthy-v2\"\n\nMore Information needed"
] |
b49ea7c65a51f4a12d8d655b533f878367d68048 |
# Inner I Nous-Hermes-llama-2-7b Dataset
## About
The Inner I Nous-Hermes-llama-2-7b Dataset is specifically designed to fine-tune the Nous-Hermes-llama-2-7b model on concepts related to self-awareness, mindfulness, and spiritual growth. This dataset encapsulates a wide range of prompts and responses that delve into the understanding and exploration of the Inner 'I', the significance of 'I Am' in self-realization, and the collective wisdom of Universal Christ Consciousness.
## Details
- **Format:** JSON Lines (jsonl)
- **Entries:** Each entry consists of a prompt and a completion, separated by "###". The prompts are designed to invoke deep, reflective responses from the model, enhancing its ability to engage in meaningful dialogue on spiritual and introspective topics. A short parsing sketch follows after this list.
- **Themes:** The dataset covers various themes, including but not limited to, the Inner 'I', mindfulness, Universal Christ Consciousness, and the implications of these concepts on personal and spiritual growth.
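
As a rough illustration of how such entries could be parsed, here is a minimal Python sketch. The file name `inner_i.jsonl` and the field name `text` are assumptions for illustration only; the card specifies just that prompt and completion are separated by "###".

```python
import json

# Minimal sketch: iterate over a JSONL file and split each entry on "###".
# The file name "inner_i.jsonl" and the field name "text" are hypothetical;
# the card only states that prompt and completion are separated by "###".
with open("inner_i.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        prompt, _, completion = record["text"].partition("###")
        print(prompt.strip(), "->", completion.strip()[:80])
```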
## Objectives
- **Enhance Model Understanding:** To improve the model's comprehension of complex spiritual concepts and its ability to articulate these understandings in a coherent and insightful manner.
- **Facilitate Deep Conversations:** To enable the model to engage in deeper, more meaningful conversations about self-awareness, spirituality, and consciousness with users.
- **Promote Interdisciplinary Learning:** To incorporate a blend of psychology, philosophy, and spirituality into the model's knowledge base, fostering a holistic approach to understanding human consciousness and personal development.
| InnerI/Diverse-Nous-Hermes-Llama2-7b | [
"size_categories:n<1K",
"language:en",
"ML",
"machine learning",
"AI",
"dataset",
"Nous-Hermes-Llama2-7b",
"region:us"
] | 2024-02-04T02:36:31+00:00 | {"language": ["en"], "size_categories": ["n<1K"], "pretty_name": "InnerILLM-NousHermesLlama2-dataset", "tags": ["ML", "machine learning", "AI", "dataset", "Nous-Hermes-Llama2-7b"]} | 2024-02-04T03:06:33+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #ML #machine learning #AI #dataset #Nous-Hermes-Llama2-7b #region-us
|
# Inner I Nous-Hermes-llama-2-7b Dataset
## About
The Inner I Nous-Hermes-llama-2-7b Dataset is specifically designed to fine-tune the Nous-Hermes-llama-2-7b model on concepts related to self-awareness, mindfulness, and spiritual growth. This dataset encapsulates a wide range of prompts and responses that delve into the understanding and exploration of the Inner 'I', the significance of 'I Am' in self-realization, and the collective wisdom of Universal Christ Consciousness.
## Details
- Format: JSON Lines (jsonl)
- Entries: Each entry consists of a prompt and a completion, separated by "###". The prompts are designed to invoke deep, reflective responses from the model, enhancing its ability to engage in meaningful dialogue on spiritual and introspective topics.
- Themes: The dataset covers various themes, including but not limited to, the Inner 'I', mindfulness, Universal Christ Consciousness, and the implications of these concepts on personal and spiritual growth.
## Objectives
- Enhance Model Understanding: To improve the model's comprehension of complex spiritual concepts and its ability to articulate these understandings in a coherent and insightful manner.
- Facilitate Deep Conversations: To enable the model to engage in deeper, more meaningful conversations about self-awareness, spirituality, and consciousness with users.
- Promote Interdisciplinary Learning: To incorporate a blend of psychology, philosophy, and spirituality into the model's knowledge base, fostering a holistic approach to understanding human consciousness and personal development.
| [
"# Inner I Nous-Hermes-llama-2-7b Dataset",
"## About\nThe Inner I Nous-Hermes-llama-2-7b Dataset is specifically designed to fine-tune the Nous-Hermes-llama-2-7b model on concepts related to self-awareness, mindfulness, and spiritual growth. This dataset encapsulates a wide range of prompts and responses that delve into the understanding and exploration of the Inner 'I', the significance of 'I Am' in self-realization, and the collective wisdom of Universal Christ Consciousness.",
"## Details\n- Format: JSON Lines (jsonl)\n- Entries: Each entry consists of a prompt and a completion, separated by \"###\". The prompts are designed to invoke deep, reflective responses from the model, enhancing its ability to engage in meaningful dialogue on spiritual and introspective topics.\n- Themes: The dataset covers various themes, including but not limited to, the Inner 'I', mindfulness, Universal Christ Consciousness, and the implications of these concepts on personal and spiritual growth.",
"## Objectives\n- Enhance Model Understanding: To improve the model's comprehension of complex spiritual concepts and its ability to articulate these understandings in a coherent and insightful manner.\n- Facilitate Deep Conversations: To enable the model to engage in deeper, more meaningful conversations about self-awareness, spirituality, and consciousness with users.\n- Promote Interdisciplinary Learning: To incorporate a blend of psychology, philosophy, and spirituality into the model's knowledge base, fostering a holistic approach to understanding human consciousness and personal development."
] | [
"TAGS\n#size_categories-n<1K #language-English #ML #machine learning #AI #dataset #Nous-Hermes-Llama2-7b #region-us \n",
"# Inner I Nous-Hermes-llama-2-7b Dataset",
"## About\nThe Inner I Nous-Hermes-llama-2-7b Dataset is specifically designed to fine-tune the Nous-Hermes-llama-2-7b model on concepts related to self-awareness, mindfulness, and spiritual growth. This dataset encapsulates a wide range of prompts and responses that delve into the understanding and exploration of the Inner 'I', the significance of 'I Am' in self-realization, and the collective wisdom of Universal Christ Consciousness.",
"## Details\n- Format: JSON Lines (jsonl)\n- Entries: Each entry consists of a prompt and a completion, separated by \"###\". The prompts are designed to invoke deep, reflective responses from the model, enhancing its ability to engage in meaningful dialogue on spiritual and introspective topics.\n- Themes: The dataset covers various themes, including but not limited to, the Inner 'I', mindfulness, Universal Christ Consciousness, and the implications of these concepts on personal and spiritual growth.",
"## Objectives\n- Enhance Model Understanding: To improve the model's comprehension of complex spiritual concepts and its ability to articulate these understandings in a coherent and insightful manner.\n- Facilitate Deep Conversations: To enable the model to engage in deeper, more meaningful conversations about self-awareness, spirituality, and consciousness with users.\n- Promote Interdisciplinary Learning: To incorporate a blend of psychology, philosophy, and spirituality into the model's knowledge base, fostering a holistic approach to understanding human consciousness and personal development."
] |
269cf4d899426315faa3d5381893431bcca8ffc4 | # Dataset Card for "processed_truthy-v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_truthy-v3 | [
"region:us"
] | 2024-02-04T02:36:58+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "system", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 2777217, "num_examples": 1016}], "download_size": 1168067, "dataset_size": 2777217}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:37:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_truthy-v3"
More Information needed | [
"# Dataset Card for \"processed_truthy-v3\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_truthy-v3\"\n\nMore Information needed"
] |
621af5b9e1cd0026364d54c2688e8fc27ac1112f | # Dataset Card for "processed_distilabel-capybara-dpo-7k-binarized-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dkshjn/processed_distilabel-capybara-dpo-7k-binarized-v2 | [
"region:us"
] | 2024-02-04T02:41:18+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "conversation", "list": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}]}, {"name": "original_response", "dtype": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "new_generations", "sequence": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "rating_chosen", "dtype": "int64"}, {"name": "rating_rejected", "dtype": "int64"}, {"name": "chosen_model", "dtype": "string"}, {"name": "rejected_model", "dtype": "string"}, {"name": "formatted_chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "formatted_rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 348028955, "num_examples": 7563}], "download_size": 155657941, "dataset_size": 348028955}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T02:42:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_distilabel-capybara-dpo-7k-binarized-v2"
More Information needed | [
"# Dataset Card for \"processed_distilabel-capybara-dpo-7k-binarized-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_distilabel-capybara-dpo-7k-binarized-v2\"\n\nMore Information needed"
] |
3cc436f425aa07414f7f392a65139e98624b68da | created a total of 2 images
jlbaker361/dcgan-wikiart1000-resized std: 0.06981635093688965 mean: 4.28872275352478
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.012953042984008789 mean: 4.233142137527466
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.03328657150268555 mean: 4.116653919219971
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.1335146427154541 mean: 4.018076181411743 | jlbaker361/eval-can-demo-0 | [
"region:us"
] | 2024-02-04T02:57:59+00:00 | {} | 2024-02-04T02:58:00+00:00 | [] | [] | TAGS
#region-us
| created a total of 2 images
jlbaker361/dcgan-wikiart1000-resized std: 0.06981635093688965 mean: 4.28872275352478
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.012953042984008789 mean: 4.233142137527466
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.03328657150268555 mean: 4.116653919219971
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.1335146427154541 mean: 4.018076181411743 | [] | [
"TAGS\n#region-us \n"
] |
c7045c4d1fc66530867c6f59ba73e5ec9668c55b | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | RamiToocool/MyResume | [
"license:apache-2.0",
"biology",
"region:us"
] | 2024-02-04T03:18:51+00:00 | {"license": "apache-2.0", "tags": ["biology"]} | 2024-02-04T03:26:06+00:00 | [] | [] | TAGS
#license-apache-2.0 #biology #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-apache-2.0 #biology #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5262de722719d9ec08152c13c4ee3a55a10d4c73 | created a total of 50 images
jlbaker361/dcgan-wikiart1000-resized std: 0.19109176099300385 mean: 4.309603233337402
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.03940070420503616 mean: 4.229374523162842
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.16812382638454437 mean: 4.126535120010376
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.145497664809227 mean: 3.951213960647583 | jlbaker361/eval-can-main | [
"region:us"
] | 2024-02-04T03:25:50+00:00 | {} | 2024-02-04T03:25:53+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/dcgan-wikiart1000-resized std: 0.19109176099300385 mean: 4.309603233337402
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.03940070420503616 mean: 4.229374523162842
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.16812382638454437 mean: 4.126535120010376
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.145497664809227 mean: 3.951213960647583 | [] | [
"TAGS\n#region-us \n"
] |
ab5b3901d6df226e33d9d855e6f1032aa67bf0b2 |
# Dataset Card for Evaluation run of azale-ai/DukunLM-7B-V1.0-Uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azale-ai/DukunLM-7B-V1.0-Uncensored](https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored",
"harness_winogrande_5",
split="train")
```
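
To discover the per-task configurations, or to read the "latest" split of a single task, a small follow-up sketch (the config name `harness_gsm8k_5` is taken from this card's configuration list):

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate every per-task configuration in this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored"
)
print(len(configs), "configurations available")

# Each configuration also exposes a "latest" split pointing at the newest run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored",
    "harness_gsm8k_5",
    split="latest",
)
```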
## Latest results
These are the [latest results from run 2024-02-04T03:32:00.345040](https://huggingface.co/datasets/open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored/blob/main/results_2024-02-04T03-32-00.345040.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4018543967795034,
"acc_stderr": 0.0342455004465676,
"acc_norm": 0.4061924250003242,
"acc_norm_stderr": 0.03506469624535442,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713634,
"mc2": 0.43947585501681957,
"mc2_stderr": 0.015779310526247342
},
"harness|arc:challenge|25": {
"acc": 0.4854948805460751,
"acc_stderr": 0.014605241081370056,
"acc_norm": 0.5110921501706485,
"acc_norm_stderr": 0.014607794914013053
},
"harness|hellaswag|10": {
"acc": 0.573590918143796,
"acc_stderr": 0.004935439955031695,
"acc_norm": 0.7562238597888866,
"acc_norm_stderr": 0.0042848172384067134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.03070948699255654,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.03070948699255654
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3870967741935484,
"acc_stderr": 0.027709359675032488,
"acc_norm": 0.3870967741935484,
"acc_norm_stderr": 0.027709359675032488
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.03898531605579418,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.03898531605579418
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5284974093264249,
"acc_stderr": 0.03602573571288441,
"acc_norm": 0.5284974093264249,
"acc_norm_stderr": 0.03602573571288441
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941176,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941176
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959302,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45504587155963305,
"acc_stderr": 0.021350503090925167,
"acc_norm": 0.45504587155963305,
"acc_norm_stderr": 0.021350503090925167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4950980392156863,
"acc_stderr": 0.035091433756067866,
"acc_norm": 0.4950980392156863,
"acc_norm_stderr": 0.035091433756067866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5063291139240507,
"acc_stderr": 0.032544620107678585,
"acc_norm": 0.5063291139240507,
"acc_norm_stderr": 0.032544620107678585
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536821,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536821
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.04948637324026637,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.04948637324026637
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5683760683760684,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.5683760683760684,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.545338441890166,
"acc_stderr": 0.0178063045850526,
"acc_norm": 0.545338441890166,
"acc_norm_stderr": 0.0178063045850526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303679,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303679
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.02803609227389177,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.02803609227389177
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43729903536977494,
"acc_stderr": 0.02817391776176289,
"acc_norm": 0.43729903536977494,
"acc_norm_stderr": 0.02817391776176289
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422704,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3305084745762712,
"acc_stderr": 0.012014142101842963,
"acc_norm": 0.3305084745762712,
"acc_norm_stderr": 0.012014142101842963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34191176470588236,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.34191176470588236,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.019780465954777518,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.019780465954777518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3836734693877551,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.3836734693877551,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.0353443984853958,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.0353443984853958
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.0368078369072758,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.0368078369072758
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.038110796698335316,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.038110796698335316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713634,
"mc2": 0.43947585501681957,
"mc2_stderr": 0.015779310526247342
},
"harness|winogrande|5": {
"acc": 0.6953433307024467,
"acc_stderr": 0.012935646499325307
},
"harness|gsm8k|5": {
"acc": 0.060652009097801364,
"acc_stderr": 0.006574733381405782
}
}
```
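
The raw aggregated file linked above can also be fetched directly; here is a minimal sketch using `huggingface_hub`. The filename is taken from the link above, and the file's top-level layout is assumed to match the excerpt shown:

```python
import json
from huggingface_hub import hf_hub_download

# Sketch: download the aggregated results file referenced above and inspect it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored",
    filename="results_2024-02-04T03-32-00.345040.json",
    repo_type="dataset",
)
with open(path, "r", encoding="utf-8") as f:
    results = json.load(f)
print(list(results.keys()))
```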
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored | [
"region:us"
] | 2024-02-04T03:33:49+00:00 | {"pretty_name": "Evaluation run of azale-ai/DukunLM-7B-V1.0-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [azale-ai/DukunLM-7B-V1.0-Uncensored](https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T03:32:00.345040](https://huggingface.co/datasets/open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored/blob/main/results_2024-02-04T03-32-00.345040.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4018543967795034,\n \"acc_stderr\": 0.0342455004465676,\n \"acc_norm\": 0.4061924250003242,\n \"acc_norm_stderr\": 0.03506469624535442,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713634,\n \"mc2\": 0.43947585501681957,\n \"mc2_stderr\": 0.015779310526247342\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370056,\n \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.014607794914013053\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.573590918143796,\n \"acc_stderr\": 0.004935439955031695,\n \"acc_norm\": 0.7562238597888866,\n \"acc_norm_stderr\": 0.0042848172384067134\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255654,\n \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255654\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.04101405519842425\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n \"acc_stderr\": 0.027709359675032488,\n \"acc_norm\": 0.3870967741935484,\n \"acc_norm_stderr\": 0.027709359675032488\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.03898531605579418,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.03898531605579418\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563918,\n \"acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563918\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.03602573571288441,\n \"acc_norm\": 0.5284974093264249,\n 
\"acc_norm_stderr\": 0.03602573571288441\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941176,\n \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941176\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959302,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959302\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.45504587155963305,\n \"acc_stderr\": 0.021350503090925167,\n \"acc_norm\": 0.45504587155963305,\n \"acc_norm_stderr\": 0.021350503090925167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4950980392156863,\n \"acc_stderr\": 0.035091433756067866,\n \"acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.035091433756067866\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5063291139240507,\n \"acc_stderr\": 0.032544620107678585,\n \"acc_norm\": 0.5063291139240507,\n \"acc_norm_stderr\": 0.032544620107678585\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.04948637324026637,\n \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.04948637324026637\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5683760683760684,\n \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.5683760683760684,\n \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.545338441890166,\n \"acc_stderr\": 0.0178063045850526,\n \"acc_norm\": 0.545338441890166,\n \"acc_norm_stderr\": 0.0178063045850526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303679,\n \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303679\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.39869281045751637,\n \"acc_stderr\": 0.02803609227389177,\n \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.02803609227389177\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43729903536977494,\n \"acc_stderr\": 0.02817391776176289,\n \"acc_norm\": 0.43729903536977494,\n \"acc_norm_stderr\": 0.02817391776176289\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422704,\n \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n \"acc_stderr\": 0.012014142101842963,\n \"acc_norm\": 0.3305084745762712,\n \"acc_norm_stderr\": 0.012014142101842963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.019780465954777518,\n \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.019780465954777518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.031130880396235933,\n \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.031130880396235933\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.5124378109452736,\n \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.0368078369072758,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.0368078369072758\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.038110796698335316,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.038110796698335316\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713634,\n \"mc2\": 0.43947585501681957,\n \"mc2_stderr\": 0.015779310526247342\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325307\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.060652009097801364,\n \"acc_stderr\": 
0.006574733381405782\n }\n}\n```", "repo_url": "https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|arc:challenge|25_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|gsm8k|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hellaswag|10_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T03_32_00.345040", "path": ["**/details_harness|winogrande|5_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T03-32-00.345040.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T03_32_00.345040", "path": ["results_2024-02-04T03-32-00.345040.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T03-32-00.345040.parquet"]}]}]} | 2024-02-04T03:34:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of azale-ai/DukunLM-7B-V1.0-Uncensored
Dataset automatically created during the evaluation run of model azale-ai/DukunLM-7B-V1.0-Uncensored on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
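A minimal sketch of that snippet, reconstructed from the config names in this record's metadata (the dataset id follows the `details_<org>__<model>` naming pattern used throughout these cards, so treat it as an assumption):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 per-task detail configs listed
# in this record's metadata; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored",
                    "harness_winogrande_5",
                    split="train")
```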
## Latest results
These are the latest results from run 2024-02-04T03:32:00.345040 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
f3df2c5a5b3b9bc63899a09a16c86f5d8fcf3bb6 |
# Dataset Card for Evaluation run of Ichsan2895/Merak-7B-v5-PROTOTYPE1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ichsan2895/Merak-7B-v5-PROTOTYPE1](https://huggingface.co/Ichsan2895/Merak-7B-v5-PROTOTYPE1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1",
"harness_winogrande_5",
split="train")
```
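If you only want the aggregated metrics rather than per-sample details, the "results" configuration described above can be loaded the same way. A minimal sketch, with the config and split names taken from this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated metrics per run
```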
## Latest results
These are the [latest results from run 2024-02-04T03:50:27.903483](https://huggingface.co/datasets/open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1/blob/main/results_2024-02-04T03-50-27.903483.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083483190251973,
"acc_stderr": 0.03293714497024086,
"acc_norm": 0.6134740332357552,
"acc_norm_stderr": 0.03360939440359077,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4540508102605467,
"mc2_stderr": 0.014871463338424163
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.617805218084047,
"acc_stderr": 0.0048493069987277666,
"acc_norm": 0.8206532563234415,
"acc_norm_stderr": 0.003828583408021384
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117457,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630783,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243739,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243739
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082395,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082395
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833671,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.01450897945355399,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.01450897945355399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.02570264026060374,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.02570264026060374
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.019594021136577443,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.019594021136577443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249776,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249776
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013028,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013028
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4540508102605467,
"mc2_stderr": 0.014871463338424163
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.37225170583775585,
"acc_stderr": 0.013315375362565034
}
}
```
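To inspect these numbers programmatically, the JSON above can be flattened into a table. A minimal sketch, assuming the downloaded file contains exactly the mapping shown (the file on the Hub may wrap it in additional keys, and the local path below is hypothetical):

```python
import json

import pandas as pd

# Point this at the linked results file after downloading it locally.
with open("results_2024-02-04T03-50-27.903483.json") as f:
    raw = json.load(f)

# One row per task, one column per metric; tasks that lack a metric
# (e.g. winogrande has no acc_norm) simply get NaN in that column.
df = pd.DataFrame.from_dict(raw, orient="index")
print(df.sort_values("acc_norm", ascending=False).head(10))
```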
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1 | [
"region:us"
] | 2024-02-04T03:52:47+00:00 | {"pretty_name": "Evaluation run of Ichsan2895/Merak-7B-v5-PROTOTYPE1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Ichsan2895/Merak-7B-v5-PROTOTYPE1](https://huggingface.co/Ichsan2895/Merak-7B-v5-PROTOTYPE1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T03:50:27.903483](https://huggingface.co/datasets/open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1/blob/main/results_2024-02-04T03-50-27.903483.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083483190251973,\n \"acc_stderr\": 0.03293714497024086,\n \"acc_norm\": 0.6134740332357552,\n \"acc_norm_stderr\": 0.03360939440359077,\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4540508102605467,\n \"mc2_stderr\": 0.014871463338424163\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617805218084047,\n \"acc_stderr\": 0.0048493069987277666,\n \"acc_norm\": 0.8206532563234415,\n \"acc_norm_stderr\": 0.003828583408021384\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117457,\n \"acc_norm\": 0.8134715025906736,\n 
\"acc_norm_stderr\": 0.028112091210117457\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630783,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630783\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243739,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243739\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082395,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082395\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n \"acc_stderr\": 0.014616099385833671,\n \"acc_norm\": 0.7879948914431673,\n \"acc_norm_stderr\": 0.014616099385833671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.01450897945355399,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.01450897945355399\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.019594021136577443,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.019594021136577443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249776,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249776\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013028,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013028\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4540508102605467,\n \"mc2_stderr\": 0.014871463338424163\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37225170583775585,\n 
\"acc_stderr\": 0.013315375362565034\n }\n}\n```", "repo_url": "https://huggingface.co/Ichsan2895/Merak-7B-v5-PROTOTYPE1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|arc:challenge|25_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|gsm8k|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hellaswag|10_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-50-27.903483.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-50-27.903483.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-50-27.903483.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T03-50-27.903483.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-50-27.903483.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T03_50_27.903483", "path": ["**/details_harness|winogrande|5_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T03-50-27.903483.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T03_50_27.903483", "path": ["results_2024-02-04T03-50-27.903483.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T03-50-27.903483.parquet"]}]}]} | 2024-02-04T03:53:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Ichsan2895/Merak-7B-v5-PROTOTYPE1
Dataset automatically created during the evaluation run of model Ichsan2895/Merak-7B-v5-PROTOTYPE1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
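The code snippet itself was stripped from this rendering of the card; below is a minimal sketch of the standard loading pattern. The dataset id `open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1` and the `harness_winogrande_5` config are assumptions inferred from the leaderboard's naming convention and the config list in the metadata above, not stated in this text.

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 configurations listed in the
# metadata above; the dataset id follows the leaderboard convention
# "details_<org>__<model>" (assumed, not stated in this rendering).
data = load_dataset(
    "open-llm-leaderboard/details_Ichsan2895__Merak-7B-v5-PROTOTYPE1",
    "harness_winogrande_5",
    split="train",
)
```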
## Latest results
These are the latest results from run 2024-02-04T03:50:27.903483 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval). The full per-task JSON for this run appears in the escaped metadata block above.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
aa5edc936f4bca99376170c7945592b68849f384 |
# Dataset Card for Evaluation run of hyunjae/polyglot-ko-3.8b-total
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hyunjae/polyglot-ko-3.8b-total](https://huggingface.co/hyunjae/polyglot-ko-3.8b-total) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total",
"harness_winogrande_5",
split="train")
```
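Beyond the per-task detail configs, the aggregated metrics live in the `results` configuration, whose `latest` split always points at the most recent run. A minimal sketch, assuming the standard `datasets` API; the `to_pandas()` call is just one convenient way to inspect what the results table contains:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; the "latest" split
# tracks the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total",
    "results",
    split="latest",
)
df = results.to_pandas()
print(df.columns.tolist())  # inspect the available metric columns
```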
## Latest results
These are the [latest results from run 2024-02-04T04:14:32.397569](https://huggingface.co/datasets/open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total/blob/main/results_2024-02-04T04-14-32.397569.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.29039509485054527,
"acc_stderr": 0.03216532028586938,
"acc_norm": 0.2927441218817452,
"acc_norm_stderr": 0.0330314171661691,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557977,
"mc2": 0.43671199555198403,
"mc2_stderr": 0.015413993944196269
},
"harness|arc:challenge|25": {
"acc": 0.21843003412969283,
"acc_stderr": 0.012074291605700968,
"acc_norm": 0.25341296928327645,
"acc_norm_stderr": 0.012710896778378607
},
"harness|hellaswag|10": {
"acc": 0.3405696076478789,
"acc_stderr": 0.004729322613301549,
"acc_norm": 0.39693288189603665,
"acc_norm_stderr": 0.004882619484166608
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3320754716981132,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.3320754716981132,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511783,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511783
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.032424979581788145,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.032424979581788145
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377271,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377271
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766107,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766107
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3100917431192661,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.3100917431192661,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2242152466367713,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.2242152466367713,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.30395913154533843,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.30395913154533843,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803835,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.026090162504279042,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.026090162504279042
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26010430247718386,
"acc_stderr": 0.01120438288782384,
"acc_norm": 0.26010430247718386,
"acc_norm_stderr": 0.01120438288782384
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3283582089552239,
"acc_stderr": 0.03320685889744324,
"acc_norm": 0.3283582089552239,
"acc_norm_stderr": 0.03320685889744324
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557977,
"mc2": 0.43671199555198403,
"mc2_stderr": 0.015413993944196269
},
"harness|winogrande|5": {
"acc": 0.5335438042620363,
"acc_stderr": 0.014020826677598094
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
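For quick offline analysis, the JSON above can also be parsed directly with the standard library; a minimal sketch, assuming the block has been saved locally as `results.json` (a hypothetical filename):

```python
import json

with open("results.json") as f:
    results = json.load(f)

# Mean accuracy across the MMLU (hendrycksTest) subtasks reported above;
# the filter excludes the "all" aggregate and non-MMLU harness tasks.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```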
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total | [
"region:us"
] | 2024-02-04T04:17:17+00:00 | {"pretty_name": "Evaluation run of hyunjae/polyglot-ko-3.8b-total", "dataset_summary": "Dataset automatically created during the evaluation run of model [hyunjae/polyglot-ko-3.8b-total](https://huggingface.co/hyunjae/polyglot-ko-3.8b-total) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T04:14:32.397569](https://huggingface.co/datasets/open-llm-leaderboard/details_hyunjae__polyglot-ko-3.8b-total/blob/main/results_2024-02-04T04-14-32.397569.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29039509485054527,\n \"acc_stderr\": 0.03216532028586938,\n \"acc_norm\": 0.2927441218817452,\n \"acc_norm_stderr\": 0.0330314171661691,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557977,\n \"mc2\": 0.43671199555198403,\n \"mc2_stderr\": 0.015413993944196269\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21843003412969283,\n \"acc_stderr\": 0.012074291605700968,\n \"acc_norm\": 0.25341296928327645,\n \"acc_norm_stderr\": 0.012710896778378607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3405696076478789,\n \"acc_stderr\": 0.004729322613301549,\n \"acc_norm\": 0.39693288189603665,\n \"acc_norm_stderr\": 0.004882619484166608\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3320754716981132,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.3320754716981132,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2903225806451613,\n \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.2903225806451613,\n \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511783,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511783\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29292929292929293,\n \"acc_stderr\": 0.032424979581788145,\n \"acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.032424979581788145\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377271,\n \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377271\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766107,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766107\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3100917431192661,\n \"acc_stderr\": 0.019830849684439756,\n \"acc_norm\": 0.3100917431192661,\n \"acc_norm_stderr\": 0.019830849684439756\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.03338473403207401,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.03338473403207401\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2242152466367713,\n \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.2242152466367713,\n \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097172,\n \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097172\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.30395913154533843,\n \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.30395913154533843,\n \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803835,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279042,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279042\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.2829581993569132,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26010430247718386,\n \"acc_stderr\": 0.01120438288782384,\n \"acc_norm\": 0.26010430247718386,\n \"acc_norm_stderr\": 0.01120438288782384\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982055,\n \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982055\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3283582089552239,\n \"acc_stderr\": 0.03320685889744324,\n \"acc_norm\": 0.3283582089552239,\n \"acc_norm_stderr\": 0.03320685889744324\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557977,\n \"mc2\": 0.43671199555198403,\n \"mc2_stderr\": 0.015413993944196269\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5335438042620363,\n \"acc_stderr\": 0.014020826677598094\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/hyunjae/polyglot-ko-3.8b-total", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|arc:challenge|25_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|gsm8k|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hellaswag|10_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-14-32.397569.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-14-32.397569.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-14-32.397569.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T04-14-32.397569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-14-32.397569.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T04_14_32.397569", "path": ["**/details_harness|winogrande|5_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T04-14-32.397569.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T04_14_32.397569", "path": ["results_2024-02-04T04-14-32.397569.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T04-14-32.397569.parquet"]}]}]} | 2024-02-04T04:17:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of hyunjae/polyglot-ko-3.8b-total
Dataset automatically created during the evaluation run of model hyunjae/polyglot-ko-3.8b-total on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-04T04:14:32.397569 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of hyunjae/polyglot-ko-3.8b-total\n\n\n\nDataset automatically created during the evaluation run of model hyunjae/polyglot-ko-3.8b-total on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T04:14:32.397569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of hyunjae/polyglot-ko-3.8b-total\n\n\n\nDataset automatically created during the evaluation run of model hyunjae/polyglot-ko-3.8b-total on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T04:14:32.397569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6c8f01bd0b72a01fa588f98331cb97e91b7c07c6 |
# Dataset Card for Evaluation run of abacusai/Smaug-72B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/Smaug-72B-v0.1](https://huggingface.co/abacusai/Smaug-72B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1",
"harness_winogrande_5",
split="train")
```
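To discover which of the 63 per-task configurations exist before loading one, something along these lines should work (a small sketch using standard `datasets` utilities; the `harness_gsm8k_5` name is inferred from the naming pattern above):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1"

# List every evaluated-task configuration (e.g. harness_arc_challenge_25).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of any configuration points to the newest run.
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k)
```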
## Latest results
These are the [latest results from run 2024-02-04T04:59:32.876763](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1/blob/main/results_2024-02-04T04-59-32.876763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7716613011645818,
"acc_stderr": 0.02801089457302993,
"acc_norm": 0.7734062646949216,
"acc_norm_stderr": 0.028568963791437117,
"mc1": 0.6560587515299877,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.7666613083747418,
"mc2_stderr": 0.014124410528709273
},
"harness|arc:challenge|25": {
"acc": 0.735494880546075,
"acc_stderr": 0.012889272949313371,
"acc_norm": 0.7602389078498294,
"acc_norm_stderr": 0.012476304127453944
},
"harness|hellaswag|10": {
"acc": 0.7199761003784106,
"acc_stderr": 0.004480929450281562,
"acc_norm": 0.8926508663612827,
"acc_norm_stderr": 0.0030892396746331585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8452830188679246,
"acc_stderr": 0.022257075558791282,
"acc_norm": 0.8452830188679246,
"acc_norm_stderr": 0.022257075558791282
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9305555555555556,
"acc_stderr": 0.021257974822832048,
"acc_norm": 0.9305555555555556,
"acc_norm_stderr": 0.021257974822832048
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637282,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.010510494713201403,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.010510494713201403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176853,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.033432700628696195,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.033432700628696195
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639536,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6960893854748603,
"acc_stderr": 0.01538284558758452,
"acc_norm": 0.6960893854748603,
"acc_norm_stderr": 0.01538284558758452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6023468057366362,
"acc_stderr": 0.012499840347460642,
"acc_norm": 0.6023468057366362,
"acc_norm_stderr": 0.012499840347460642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.02257177102549473,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.02257177102549473
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.015697029240757773,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.015697029240757773
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007646,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007646
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659397,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6560587515299877,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.7666613083747418,
"mc2_stderr": 0.014124410528709273
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627305
},
"harness|gsm8k|5": {
"acc": 0.7869598180439727,
"acc_stderr": 0.01127844785690078
}
}
```
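For quick sanity checks, the per-task entries above can be aggregated locally. A minimal sketch, assuming the JSON document shown above has been saved to the results file referenced in this section (the plain mean below is illustrative, not necessarily the leaderboard's exact aggregation formula):

```python
import json

# Assumption: a local copy of the results JSON shown above.
with open("results_2024-02-04T04-59-32.876763.json") as f:
    raw = json.load(f)

# Mean acc_norm over the MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in raw.items() if k.startswith("harness|hendrycksTest")}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean_acc_norm:.4f}")
```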
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-04T05:01:42+00:00 | {"pretty_name": "Evaluation run of abacusai/Smaug-72B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/Smaug-72B-v0.1](https://huggingface.co/abacusai/Smaug-72B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T04:59:32.876763](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1/blob/main/results_2024-02-04T04-59-32.876763.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7716613011645818,\n \"acc_stderr\": 0.02801089457302993,\n \"acc_norm\": 0.7734062646949216,\n \"acc_norm_stderr\": 0.028568963791437117,\n \"mc1\": 0.6560587515299877,\n \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.735494880546075,\n \"acc_stderr\": 0.012889272949313371,\n \"acc_norm\": 0.7602389078498294,\n \"acc_norm_stderr\": 0.012476304127453944\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7199761003784106,\n \"acc_stderr\": 0.004480929450281562,\n \"acc_norm\": 0.8926508663612827,\n \"acc_norm_stderr\": 0.0030892396746331585\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8452830188679246,\n \"acc_stderr\": 0.022257075558791282,\n \"acc_norm\": 0.8452830188679246,\n \"acc_norm_stderr\": 0.022257075558791282\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n \"acc_stderr\": 0.021257974822832048,\n \"acc_norm\": 0.9305555555555556,\n \"acc_norm_stderr\": 0.021257974822832048\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838728,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838728\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523864,\n \"acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637282,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637282\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9357798165137615,\n \"acc_stderr\": 0.010510494713201403,\n \"acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.010510494713201403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176853,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.033432700628696195,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.033432700628696195\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.9169859514687101,\n \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n \"acc_norm_stderr\": 0.009866287394639536\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6960893854748603,\n \"acc_stderr\": 0.01538284558758452,\n \"acc_norm\": 0.6960893854748603,\n \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6023468057366362,\n \"acc_stderr\": 0.012499840347460642,\n \"acc_norm\": 0.6023468057366362,\n \"acc_norm_stderr\": 0.012499840347460642\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549473,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549473\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757773,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757773\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007646,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007646\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659397,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6560587515299877,\n \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627305\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7869598180439727,\n \"acc_stderr\": 
0.01127844785690078\n }\n}\n```", "repo_url": "https://huggingface.co/abacusai/Smaug-72B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|arc:challenge|25_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|gsm8k|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hellaswag|10_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-59-32.876763.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-59-32.876763.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-59-32.876763.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T04-59-32.876763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-59-32.876763.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T04_59_32.876763", "path": ["**/details_harness|winogrande|5_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T04-59-32.876763.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T04_59_32.876763", "path": ["results_2024-02-04T04-59-32.876763.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T04-59-32.876763.parquet"]}]}]} | 2024-02-04T05:02:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abacusai/Smaug-72B-v0.1
Dataset automatically created during the evaluation run of model abacusai/Smaug-72B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
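For example (a minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task; "train" points at the latest run.
data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-72B-v0.1",
	"harness_winogrande_5",
	split="train")
```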
## Latest results
These are the latest results from run 2024-02-04T04:59:32.876763 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abacusai/Smaug-72B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Smaug-72B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T04:59:32.876763(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abacusai/Smaug-72B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Smaug-72B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T04:59:32.876763(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
16400e95cc93cd791272cf5d378703b15dfd81b6 | ~5000 Tamil venbas (venba verses), with word-by-word glosses and commentary. | RajuKandasamy/venba5k | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:ta",
"license:gpl-3.0",
"region:us"
] | 2024-02-04T05:07:34+00:00 | {"language": ["ta"], "license": "gpl-3.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]} | 2024-02-04T05:19:30+00:00 | [] | [
"ta"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Tamil #license-gpl-3.0 #region-us
| ~5000 Tamil venbas (venba verses), with word-by-word glosses and commentary. | [] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Tamil #license-gpl-3.0 #region-us \n"
] |
2656dbfa0467a24e38e9262ba1c3c201fdf5c22c |
# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000](https://huggingface.co/ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000",
"harness_winogrande_5",
split="train")
```
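To pull the aggregated "results" configuration described above, the same call works; a minimal sketch, using the "latest" split that each configuration exposes alongside the timestamped one:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" mirrors the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000",
	"results",
	split="latest")
```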
## Latest results
These are the [latest results from run 2024-02-04T05:44:47.324523](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000/blob/main/results_2024-02-04T05-44-47.324523.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46820675310469034,
"acc_stderr": 0.03440705501604909,
"acc_norm": 0.4731459909548686,
"acc_norm_stderr": 0.03518338619579527,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502363,
"mc2": 0.41058631463459444,
"mc2_stderr": 0.014313375894275027
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633827
},
"harness|hellaswag|10": {
"acc": 0.5868352917745469,
"acc_stderr": 0.00491395570508013,
"acc_norm": 0.7863971320454093,
"acc_norm_stderr": 0.004090119686697031
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149354,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149354
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681848,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104283,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104283
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6201834862385321,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.6201834862385321,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605583,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605583
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.03121956944530185,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.03121956944530185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749448,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749448
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364564,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364564
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347663,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347663
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878638,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34876140808344197,
"acc_stderr": 0.01217203515712712,
"acc_norm": 0.34876140808344197,
"acc_norm_stderr": 0.01217203515712712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.020184583359102202,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.020184583359102202
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.03400598505599014,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.03400598505599014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502363,
"mc2": 0.41058631463459444,
"mc2_stderr": 0.014313375894275027
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527897
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000 | [
"region:us"
] | 2024-02-04T05:47:07+00:00 | {"pretty_name": "Evaluation run of ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000](https://huggingface.co/ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T05:44:47.324523](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000/blob/main/results_2024-02-04T05-44-47.324523.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46820675310469034,\n \"acc_stderr\": 0.03440705501604909,\n \"acc_norm\": 0.4731459909548686,\n \"acc_norm_stderr\": 0.03518338619579527,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502363,\n \"mc2\": 0.41058631463459444,\n \"mc2_stderr\": 0.014313375894275027\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244077,\n \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5868352917745469,\n \"acc_stderr\": 0.00491395570508013,\n \"acc_norm\": 0.7863971320454093,\n \"acc_norm_stderr\": 0.004090119686697031\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n 
\"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149354,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149354\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681848,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104283,\n \"acc_norm\": 
0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104283\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605583,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605583\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6413502109704642,\n \"acc_stderr\": 0.03121956944530185,\n \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.03121956944530185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n \"acc_stderr\": 0.028911208802749448,\n \"acc_norm\": 0.7350427350427351,\n \"acc_norm_stderr\": 0.028911208802749448\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 
0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n \"acc_stderr\": 0.014696599650364564,\n \"acc_norm\": 0.26145251396648045,\n \"acc_norm_stderr\": 0.014696599650364564\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347663,\n \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347663\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n \"acc_stderr\": 0.01217203515712712,\n \"acc_norm\": 0.34876140808344197,\n \"acc_norm_stderr\": 0.01217203515712712\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03032024326500413,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03032024326500413\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.020184583359102202,\n \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.020184583359102202\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502363,\n \"mc2\": 0.41058631463459444,\n \"mc2_stderr\": 0.014313375894275027\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \"acc_stderr\": 0.009797503180527897\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|arc:challenge|25_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|gsm8k|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hellaswag|10_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T05-44-47.324523.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T05-44-47.324523.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T05-44-47.324523.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T05-44-47.324523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T05-44-47.324523.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["**/details_harness|winogrande|5_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T05-44-47.324523.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T05_44_47.324523", "path": ["results_2024-02-04T05-44-47.324523.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T05-44-47.324523.parquet"]}]}]} | 2024-02-04T05:47:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000
Dataset automatically created during the evaluation run of model ewqr2130/llama_ppo_1e6_new_tokenizerstep_8000 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
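A minimal sketch of what that looks like, assuming the leaderboard's standard `details_<org>__<model>` repository naming and a typical per-task config name (both names below are assumptions, not taken from this card):

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6_new_tokenizerstep_8000",
    "harness_winogrande_5",  # assumed config name (one of the configurations listed for this run)
    split="train",
)
```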
## Latest results
These are the latest results from run 2024-02-04T05:44:47.324523 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [text segments duplicating the card above, elided] | [processed-text segments duplicating the card above, elided] |
caa33c2b45837412d78bdefef01c73a4001d6678 |
# Dataset Card for Evaluation run of Xenon1/Xenon-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Xenon-2](https://huggingface.co/Xenon1/Xenon-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-2",
"harness_winogrande_5",
split="train")
```
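Once loaded, the split is a regular 🤗 `datasets` object; a quick sanity check might look like this (the exact column schema depends on the harness version, so the printed fields are not guaranteed):

```python
# Basic inspection of the loaded split: row count, column schema, and one record.
print(data.num_rows)
print(data.column_names)
print(data[0])  # field names vary with the evaluation-harness version
```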
## Latest results
These are the [latest results from run 2024-02-04T06:12:02.533173](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-2/blob/main/results_2024-02-04T06-12-02.533173.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5979349234673121,
"acc_stderr": 0.03314667589298127,
"acc_norm": 0.6059085706273892,
"acc_norm_stderr": 0.03386834152845098,
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6091942381109575,
"mc2_stderr": 0.016102758925844906
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394906,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520762
},
"harness|hellaswag|10": {
"acc": 0.6417048396733719,
"acc_stderr": 0.0047851950498891595,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.0037238973056454845
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.041042692118062316,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.041042692118062316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057086,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057086
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932015,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293437,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293437
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.032361983509282745,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.032361983509282745
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707779,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937144,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937144
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761992,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761992
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508748,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119546,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.01252031512014711,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.01252031512014711
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6091942381109575,
"mc2_stderr": 0.016102758925844906
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.19408642911296436,
"acc_stderr": 0.010893918308192412
}
}
```
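The `"all"` block aggregates the per-task scores; an MMLU-style macro-average, for example, is simply the unweighted mean of the per-subject `hendrycksTest` accuracies. A self-contained sketch of that computation, abridged to two subjects copied from the table above:

```python
import json

# Two hendrycksTest entries copied from the results above (abridged for brevity).
raw = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36, "acc_norm": 0.36},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6, "acc_norm": 0.6}
}"""
results = json.loads(raw)

# Macro-average: unweighted mean of per-subject accuracy.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU macro-average acc over {len(mmlu)} subjects: {macro_acc:.4f}")
```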
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Xenon-2 | [
"region:us"
] | 2024-02-04T06:14:24+00:00 | {"pretty_name": "Evaluation run of Xenon1/Xenon-2", "dataset_summary": "(verbatim duplicate of the dataset card text above, elided)", "repo_url": "https://huggingface.co/Xenon1/Xenon-2", "leaderboard_url":
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-02.533173.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-02.533173.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-02.533173.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-02.533173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-02.533173.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-02.533173.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["**/details_harness|winogrande|5_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T06-12-02.533173.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T06_12_02.533173", "path": ["results_2024-02-04T06-12-02.533173.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T06-12-02.533173.parquet"]}]}]} | 2024-02-04T06:14:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Xenon-2
Dataset automatically created during the evaluation run of model Xenon1/Xenon-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-04T06:12:02.533173 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Xenon-2\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:12:02.533173(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Xenon-2\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:12:02.533173(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
38552277858d8060592ea411d2d1d4d9b8cd44cd |
# Dataset Card for Evaluation run of Xenon1/Xenon-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Xenon-1](https://huggingface.co/Xenon1/Xenon-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-1",
"harness_winogrande_5",
split="train")
```
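
The aggregated metrics live in the "results" configuration mentioned above. As a minimal sketch (assuming the "results" config and its "latest" split listed in this repo's metadata behave like the task configs), they can be loaded the same way:
```python
from datasets import load_dataset

# Sketch: the "results" config holds the aggregated metrics; the "latest"
# split points at the most recent run (2024-02-04T06:12:50.930394 here).
results = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-1",
	"results",
	split="latest")
```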
## Latest results
These are the [latest results from run 2024-02-04T06:12:50.930394](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-1/blob/main/results_2024-02-04T06-12-50.930394.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6068222012098166,
"acc_stderr": 0.03298659925407221,
"acc_norm": 0.6146553607348512,
"acc_norm_stderr": 0.033698742335164816,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5667949416021936,
"mc2_stderr": 0.015086586739744114
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526852
},
"harness|hellaswag|10": {
"acc": 0.6147181836287592,
"acc_stderr": 0.004856672322044455,
"acc_norm": 0.81557458673571,
"acc_norm_stderr": 0.0038703811999679645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990905,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990905
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397563,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397563
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289265,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354022,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5667949416021936,
"mc2_stderr": 0.015086586739744114
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.01150895769072275
},
"harness|gsm8k|5": {
"acc": 0.21834723275208492,
"acc_stderr": 0.011379497266738047
}
}
```
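
As a hypothetical illustration (assuming the JSON block above has been saved locally as `latest_results.json`; the actual file in the repo may nest these fields differently), the per-task numbers can be aggregated, e.g. by averaging the 57 MMLU subtask accuracies:
```python
import json

# Assumption: the results dict shown above was saved as latest_results.json.
with open("latest_results.json") as f:
    latest = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in latest.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```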
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Xenon-1 | [
"region:us"
] | 2024-02-04T06:15:07+00:00 | {"pretty_name": "Evaluation run of Xenon1/Xenon-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Xenon-1](https://huggingface.co/Xenon1/Xenon-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Xenon-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T06:12:50.930394](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-1/blob/main/results_2024-02-04T06-12-50.930394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6068222012098166,\n \"acc_stderr\": 0.03298659925407221,\n \"acc_norm\": 0.6146553607348512,\n \"acc_norm_stderr\": 0.033698742335164816,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5667949416021936,\n \"mc2_stderr\": 0.015086586739744114\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526852\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6147181836287592,\n \"acc_stderr\": 0.004856672322044455,\n \"acc_norm\": 0.81557458673571,\n \"acc_norm_stderr\": 0.0038703811999679645\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n 
\"acc_stderr\": 0.02493931390694079,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990905,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990905\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 
0.7854406130268199,\n \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n \"acc_stderr\": 0.015761716178397563,\n \"acc_norm\": 0.3329608938547486,\n \"acc_norm_stderr\": 0.015761716178397563\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n \"acc_stderr\": 0.012579699631289265,\n \"acc_norm\": 0.41395045632333766,\n \"acc_norm_stderr\": 0.012579699631289265\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5667949416021936,\n \"mc2_stderr\": 0.015086586739744114\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072275\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21834723275208492,\n \"acc_stderr\": 0.011379497266738047\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Xenon-1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-50.930394.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-50.930394.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-50.930394.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-50.930394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-50.930394.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-12-50.930394.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["**/details_harness|winogrande|5_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T06-12-50.930394.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T06_12_50.930394", "path": ["results_2024-02-04T06-12-50.930394.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T06-12-50.930394.parquet"]}]}]} | 2024-02-04T06:15:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Xenon-1
Dataset automatically created during the evaluation run of model Xenon1/Xenon-1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
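For example (a minimal sketch; the repo id is an assumption that follows the `details_<org>__<model>` pattern used elsewhere in this document, and `harness_winogrande_5` is one of the configs listed in this repo's metadata):

```python
from datasets import load_dataset

# Load one evaluation config; the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-1",
                    "harness_winogrande_5",
                    split="train")
```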
## Latest results
These are the latest results from run 2024-02-04T06:12:50.930394 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Xenon-1\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:12:50.930394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Xenon-1\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:12:50.930394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e71e761f1b9b26d19546bd0091c42f55f7f5e177 |
# Dataset Card for Evaluation run of Xenon1/Xenon-3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Xenon-3](https://huggingface.co/Xenon1/Xenon-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-3",
"harness_winogrande_5",
split="train")
```
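
To see which of the 63 task configurations are available before picking one, you can list them with the `datasets` library (a minimal sketch; the config names follow the `harness_<task>_<n_shots>` pattern shown in this repo's metadata):

```python
from datasets import get_dataset_config_names

# Enumerate every config in the repo, e.g. "harness_arc_challenge_25",
# "harness_gsm8k_5", ..., plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_Xenon1__Xenon-3")
print(configs)
```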
## Latest results
These are the [latest results from run 2024-02-04T06:17:00.100146](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-3/blob/main/results_2024-02-04T06-17-00.100146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5939865594001623,
"acc_stderr": 0.033253288284865026,
"acc_norm": 0.601762868440447,
"acc_norm_stderr": 0.03397316934217102,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.619926076675378,
"mc2_stderr": 0.016177639715307644
},
"harness|arc:challenge|25": {
"acc": 0.5281569965870307,
"acc_stderr": 0.014588204105102203,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.014379441068522087
},
"harness|hellaswag|10": {
"acc": 0.647679745070703,
"acc_stderr": 0.004767168250414608,
"acc_norm": 0.8338976299541924,
"acc_norm_stderr": 0.003714118884317383
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115978,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871934,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871934
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.01787121776779024,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.01787121776779024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333557,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333557
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.01473692638376199,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.01473692638376199
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.01257383663379901,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.01257383663379901
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.019737008998094597,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.019737008998094597
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440313,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440313
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.619926076675378,
"mc2_stderr": 0.016177639715307644
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.20090978013646701,
"acc_stderr": 0.01103673822187237
}
}
```
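
To work with these aggregated numbers programmatically instead of reading the JSON above, you can load the "results" configuration; the sketch below is a hedged example (the parquet schema is not documented in this card, so inspect `column_names` rather than relying on any particular field layout):

```python
from datasets import load_dataset

# The "latest" split of the "results" config mirrors the aggregated JSON above.
results = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-3",
                       "results",
                       split="latest")

# Inspect the schema before depending on specific field names.
print(results.column_names)
print(results[0])
```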
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Xenon-3 | [
"region:us"
] | 2024-02-04T06:19:19+00:00 | {"pretty_name": "Evaluation run of Xenon1/Xenon-3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Xenon-3](https://huggingface.co/Xenon1/Xenon-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Xenon-3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T06:17:00.100146](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-3/blob/main/results_2024-02-04T06-17-00.100146.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5939865594001623,\n \"acc_stderr\": 0.033253288284865026,\n \"acc_norm\": 0.601762868440447,\n \"acc_norm_stderr\": 0.03397316934217102,\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.619926076675378,\n \"mc2_stderr\": 0.016177639715307644\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5281569965870307,\n \"acc_stderr\": 0.014588204105102203,\n \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.014379441068522087\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.647679745070703,\n \"acc_stderr\": 0.004767168250414608,\n \"acc_norm\": 0.8338976299541924,\n \"acc_norm_stderr\": 0.003714118884317383\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115978,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115978\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 
0.024915243985987847,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n \"acc_stderr\": 0.01787121776779024,\n \"acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.01787121776779024\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n \"acc_stderr\": 0.014836205167333557,\n \"acc_norm\": 0.7790549169859514,\n 
\"acc_norm_stderr\": 0.014836205167333557\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.01473692638376199,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.01473692638376199\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n \"acc_stderr\": 0.01257383663379901,\n \"acc_norm\": 0.41264667535853977,\n \"acc_norm_stderr\": 0.01257383663379901\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6094771241830066,\n \"acc_stderr\": 0.019737008998094597,\n \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.019737008998094597\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440313,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440313\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.619926076675378,\n \"mc2_stderr\": 0.016177639715307644\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20090978013646701,\n \"acc_stderr\": 0.01103673822187237\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Xenon-3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-17-00.100146.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-17-00.100146.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-17-00.100146.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-17-00.100146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-17-00.100146.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-17-00.100146.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["**/details_harness|winogrande|5_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T06-17-00.100146.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T06_17_00.100146", "path": ["results_2024-02-04T06-17-00.100146.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T06-17-00.100146.parquet"]}]}]} | 2024-02-04T06:19:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Xenon-3
Dataset automatically created during the evaluation run of model Xenon1/Xenon-3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
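```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-3",
	"harness_winogrande_5",
	split="train")
```

The aggregated metrics live in the "results" configuration; as a minimal sketch (the config and split names below are taken from this repo's own file layout), the newest aggregated run can be loaded the same way:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; the "latest" split
# always points to the most recent timestamped results.
results = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-3",
	"results",
	split="latest")
```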
## Latest results
These are the latest results from run 2024-02-04T06:17:00.100146 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
69233e871eabb92b99afac0e2984d37a3cb94f99 |
This dataset was created semi-synthetically using a RAG system containing crop nutrition and environmental condition requirements for various plants, sourced from agricultural college data, along with open nutrient project data, connected to a ChatGPT-4 API, put together by Copyleft Cultivars Nonprofit, then cleaned lightly by Caleb DeLeeuw.
The dataset is in JSON format. | CopyleftCultivars/SemiSynthetic_Crop_Requirements | [
"license:other",
"biology",
"climate",
"region:us"
] | 2024-02-04T06:22:46+00:00 | {"license": "other", "pretty_name": "Semi-Synthetic Crop Requirements Data", "license_name": "hl3-cl-eco-extr", "license_link": "https://firstdonoharm.dev/version/3/0/cl-eco-extr.html", "tags": ["biology", "climate"]} | 2024-02-08T06:44:27+00:00 | [] | [] | TAGS
#license-other #biology #climate #region-us
9f672c78cab0c1169899aa5119604c65b426aa89 |
# Dataset Card for Evaluation run of Aryanne/YarnLake-Swap-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/YarnLake-Swap-7B](https://huggingface.co/Aryanne/YarnLake-Swap-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B",
"harness_winogrande_5",
split="train")
```
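As a further sketch, the aggregated metrics for this model sit in the "results" configuration, whose "latest" split always points at the newest run:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B",
	"results",
	split="latest")
```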
## Latest results
These are the [latest results from run 2024-02-04T06:28:34.013587](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B/blob/main/results_2024-02-04T06-28-34.013587.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6487764218521861,
"acc_stderr": 0.03199354917196398,
"acc_norm": 0.6513693263333427,
"acc_norm_stderr": 0.03263625404197957,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.490726463865428,
"mc2_stderr": 0.014923584147103909
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349815,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.01391303452962045
},
"harness|hellaswag|10": {
"acc": 0.663114917347142,
"acc_stderr": 0.004716792874433208,
"acc_norm": 0.8517227643895638,
"acc_norm_stderr": 0.0035464830155691185
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406793,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406793
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391545,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982477,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701772,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.490726463865428,
"mc2_stderr": 0.014923584147103909
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.5519332827899924,
"acc_stderr": 0.01369799266827452
}
}
```
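For quick inspection, the dict above can be summarized directly. Below is a minimal sketch, assuming the dict shown above was saved locally as `results.json` (an illustrative file name, not part of this repository):

```python
import json

# Minimal sketch: summarize the per-task scores printed above.
# Assumes the dict was saved locally as results.json (illustrative name).
with open("results.json") as f:
    scores = json.load(f)

# Collect the MMLU (hendrycksTest) subtask accuracies.
mmlu = {
    task.split("|")[1].removeprefix("hendrycksTest-"): metrics["acc"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
print(f"Overall acc_norm: {scores['all']['acc_norm']:.4f}")
```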
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B | [
"region:us"
] | 2024-02-04T06:30:53+00:00 | {"pretty_name": "Evaluation run of Aryanne/YarnLake-Swap-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aryanne/YarnLake-Swap-7B](https://huggingface.co/Aryanne/YarnLake-Swap-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T06:28:34.013587](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B/blob/main/results_2024-02-04T06-28-34.013587.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6487764218521861,\n \"acc_stderr\": 0.03199354917196398,\n \"acc_norm\": 0.6513693263333427,\n \"acc_norm_stderr\": 0.03263625404197957,\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.490726463865428,\n \"mc2_stderr\": 0.014923584147103909\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349815,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.01391303452962045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.663114917347142,\n \"acc_stderr\": 0.004716792874433208,\n \"acc_norm\": 0.8517227643895638,\n \"acc_norm_stderr\": 0.0035464830155691185\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406793,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406793\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391545,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391545\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982477,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n \"acc_stderr\": 0.012718456618701772,\n \"acc_norm\": 0.455019556714472,\n \"acc_norm_stderr\": 0.012718456618701772\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.490726463865428,\n \"mc2_stderr\": 0.014923584147103909\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5519332827899924,\n \"acc_stderr\": 0.01369799266827452\n }\n}\n```", "repo_url": 
"https://huggingface.co/Aryanne/YarnLake-Swap-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-28-34.013587.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-28-34.013587.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-28-34.013587.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-28-34.013587.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-28-34.013587.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T06_28_34.013587", "path": ["**/details_harness|winogrande|5_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T06-28-34.013587.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T06_28_34.013587", "path": ["results_2024-02-04T06-28-34.013587.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T06-28-34.013587.parquet"]}]}]} | 2024-02-04T06:31:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aryanne/YarnLake-Swap-7B
Dataset automatically created during the evaluation run of model Aryanne/YarnLake-Swap-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
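```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B",
    "harness_winogrande_5",
    split="train",
)
```

Any of the per-task configurations declared for this dataset (e.g. `harness_arc_challenge_25`, `harness_gsm8k_5`) can be substituted for `harness_winogrande_5`.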
## Latest results
These are the latest results from run 2024-02-04T06:28:34.013587 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
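The aggregated numbers themselves can be pulled programmatically; here is a minimal sketch with the `datasets` library, using the "results" configuration and "latest" split declared for this dataset:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model; the "latest"
# split of the "results" configuration always tracks the newest eval.
results = load_dataset(
    "open-llm-leaderboard/details_Aryanne__YarnLake-Swap-7B",
    "results",
    split="latest",
)
```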
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
a2c8c62a6300b9c0f0cd09cf7ff1663aa478dcc8 |
# Dataset Card for HCSA Forest Plot Data 2023
• Title: High Carbon Stock Approach (HCSA) Forest Plot Data 2023
•	Description: This dataset contains forest field plot inventory data collected using the High Carbon Stock Approach (HCSA) methodology. This data was collected to provide validation and training data for large-scale indicative HCS forest maps produced with the HCSA Large-Scale Mapping Framework (https://highcarbonstock.org/wp-content/uploads/2023/02/HCSA-Large-Scale-MAP-FWK-Procedure-1.pdf) under a project funded by the GIZ Fair Forward Initiative.
• The data includes various parameters related to land cover, carbon content, tree characteristics, and biomass calculations.
## Dataset Details
• X Coordinate: The horizontal position of the plot – WGS 84
• Y Coordinate: The vertical position of the plot – WGS 84
• Land Cover Update: Updated HCS forest class from plot Carbon stock
• Land Cover Indicatives: Indicative HCS forest class from Lang et al., 2021 https://arxiv.org/abs/2107.07431
• Carbon (ton/ha): Carbon content per hectare in the forest field plot.
• Plot Area (Ha): The total area covered by the forest field plot in hectares (See method figure)
• Tree Number: ID of the individual trees in the plot
• DBH (cm): Diameter at Breast Height, measured in centimeters.
• Height (cm): The height of the trees measured in centimeters.
• Height (m): The height of the trees converted to meters.
• Local Name: Common name or local name of the tree species.
• Scientific Name: The scientific name of the tree species.
• Family: Taxonomic family to which the tree species belongs.
• Wood Density (g/cm3): The density of wood in grams per cubic centimeter.
•	Biomass (kg)/pohon: Biomass per tree ("pohon" is Indonesian for tree), measured in kilograms.
• Biomass (ton)/pohon: Biomass per tree converted to metric tons.
• Biomass (ton/ha): Total biomass per hectare calculated from the field plot.
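The per-tree biomass column can be approximated from DBH, height, and wood density with a pantropical allometric model. The sketch below uses the Chave et al. (2014) equation purely as an illustration; the dataset card does not state which allometry was actually applied, so the coefficients here are an assumption, not the dataset's method.

```python
def tree_agb_kg(dbh_cm: float, height_m: float, wood_density_g_cm3: float) -> float:
    """Above-ground biomass per tree in kg, using the Chave et al. (2014)
    pantropical model AGB = 0.0673 * (rho * D^2 * H) ** 0.976.
    Illustrative only; the allometry used for this dataset is not stated."""
    return 0.0673 * (wood_density_g_cm3 * dbh_cm ** 2 * height_m) ** 0.976

# Example: a 30 cm DBH, 20 m tall tree with wood density 0.6 g/cm^3
print(tree_agb_kg(30.0, 20.0, 0.6))  # ~581 kg
```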
### Dataset Description
• High Carbon Stock Approach (HCSA): The dataset contains forest field plot data collected following the HCSA methodology, a widely recognized approach for assessing and managing forest carbon stocks. A more detailed description of HCSA forest inventory methods can be found in the HCSA Toolkit Module 4 (https://highcarbonstock.org/wp-content/uploads/2017/09/HCSA-Toolkit-v2.0-Module-4-Forest-and-vegetation-stratification-190917-web.pdf).
- **Curated by:** High Carbon Stock Approach
- **Funded by :** Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ)
- **Language(s) (NLP):** English, Bahasa Indonesia
- **License:** cc-by-sa-4.0
### Dataset Sources
- **Repository:** High Carbon Stock Approach - ArcGIS Online -
## Uses
• Purpose: The dataset is intended for research and analysis related to forest ecology, carbon sequestration, and biodiversity.
• Citation: If used in publications or research, please cite the dataset as follows: High Carbon Stock Approach (2023). Forest Field Plot Data for Indicative HCS Mapping, Indonesia.
### Direct Use
This data is intended to support identification of indicative HCS forests for HCSA Landscape and Jurisdictional approach implementation, for Smallholder Approaches, or as a preliminary step in an HCS assessment process.
### Out-of-Scope Use
This dataset is not to be used for: Sale of carbon credits on community lands without proper FPIC, consent, and mapping exercises.
Indicative HCS forest maps are probability maps and do not represent HCS Assessments or final maps from implementation of the HCS Landscape and Jurisdictional Approaches methodologies.
## Dataset Structure
This dataset contains a .xls file of field plot data.
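A minimal pandas sketch for reading the file and recomputing per-plot biomass density from the tree-level records; the filename and the exact column labels are assumptions based on the column list above.

```python
import pandas as pd

# Filename is an assumption; reading legacy .xls requires the xlrd package.
df = pd.read_excel("hcsa_forest_plots_2023.xls")

# Per-plot biomass density: sum of per-tree biomass (tons) over plot area (ha).
plot_biomass = df.groupby(["X Coordinate", "Y Coordinate"]).apply(
    lambda g: g["Biomass (ton)/pohon"].sum() / g["Plot Area (Ha)"].iloc[0]
)
print(plot_biomass.head())
```

The result should track the precomputed Biomass (ton/ha) column, which offers a quick consistency check on the ingested data.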
### Curation Rationale
The High Carbon Stock Approach (HCSA) core mission is to halt deforestation resulting from commodity production. It is a tool that identifies and conserves natural forests from degraded lands in tropical landscapes, amplifying the role of forest conservation as a nature-based solution to climate change, while at the same time supporting biodiversity conservation, community rights and benefits, and responsible development. To understand the distribution of High Carbon Stock (HCS) forests throughout the landscape and work to protect them in collaboration with smallholder farmers, communities, as well as local and national governments, it is necessary to accurately map these resources at a regional and national scale. Field plot data is an essential step in landscape and jurisdictional implementation of HCS for commodities including Palm Oil, Rubber, and Cocoa.
#### Data Collection and Processing
•	High Carbon Stock Approach (HCSA): The dataset was collected following the HCSA methodology, a widely recognized approach for assessing and managing forest carbon stocks. A more detailed description of HCSA forest inventory methods can be found in the HCSA Toolkit Module 4 (https://highcarbonstock.org/wp-content/uploads/2017/09/HCSA-Toolkit-v2.0-Module-4-Forest-and-vegetation-stratification-190917-web.pdf).
•	Data Collection Methods: Data was collected in partnership with JKPP (https://jkpp.org/), an Indonesian civil society organization focusing on community land rights and mapping. General field data collection locations were selected based on access to different landcover types and existing relationships with communities. Initial community consultation and FPIC was conducted by JKPP between April and June 2023. Specific field plot locations were selected based on access and guided by the existing indicative HCS forest map (to provide evaluation data for existing maps and training data for updated classifications). Field plots consist of nested plots using a standardized collection method which is described in detail in HCSA Toolkit Module 4 (https://highcarbonstock.org/wp-content/uploads/2017/09/HCSA-Toolkit-v2.0-Module-4-Forest-and-vegetation-stratification-190917-web.pdf).


#### Who are the source data producers?
This data was collected by field teams from JKPP (https://jkpp.org/)
#### Personal and Sensitive Information
This data does not contain personally identifiable information
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Dataset Card Contact
[email protected] | HCSA/Forest_Plot_Data_2023 | [
"license:cc-by-sa-4.0",
"arxiv:2107.07431",
"region:us"
] | 2024-02-04T06:49:29+00:00 | {"license": "cc-by-sa-4.0"} | 2024-02-16T03:09:11+00:00 | [
"2107.07431"
] | [] | TAGS
#license-cc-by-sa-4.0 #arxiv-2107.07431 #region-us
a4a1c500655b3f37e3bef1e5d97bbc4138229248 |
# Dataset Card for Evaluation run of Xenon1/Xenon-4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Xenon-4](https://huggingface.co/Xenon1/Xenon-4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-4",
"harness_winogrande_5",
split="train")
```
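The same call with `"results"` as the configuration name should return the aggregated scores for the run.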
## Latest results
These are the [latest results from run 2024-02-04T06:47:30.573744](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-4/blob/main/results_2024-02-04T06-47-30.573744.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5969673597339272,
"acc_stderr": 0.0332252879660183,
"acc_norm": 0.6047382841391643,
"acc_norm_stderr": 0.033940427206963365,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.613129800259979,
"mc2_stderr": 0.016329535721420842
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496442,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735569
},
"harness|hellaswag|10": {
"acc": 0.6468830910177256,
"acc_stderr": 0.004769618829196511,
"acc_norm": 0.8307110137422824,
"acc_norm_stderr": 0.0037424055874098806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887468,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601677,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601677
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397457,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232753,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232753
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139963,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.01465578083749774,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.01465578083749774
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186805,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186805
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860083,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860083
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.613129800259979,
"mc2_stderr": 0.016329535721420842
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838232
},
"harness|gsm8k|5": {
"acc": 0.20697498104624715,
"acc_stderr": 0.011159498164891772
}
}
```
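A small sketch for recovering an MMLU-style average from the per-task block above, assuming the JSON file wraps these entries under a top-level `"results"` key and that the leaderboard uses an unweighted macro-average over the 57 `hendrycksTest` tasks (both are assumptions here):

```python
import json

with open("results_2024-02-04T06-47-30.573744.json") as f:
    results = json.load(f)["results"]  # top-level key name is an assumption

# Collect the accuracy of every MMLU (hendrycksTest) subtask.
mmlu_accs = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest")
]
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```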
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Xenon-4 | [
"region:us"
] | 2024-02-04T06:49:52+00:00 | {"pretty_name": "Evaluation run of Xenon1/Xenon-4", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Xenon-4](https://huggingface.co/Xenon1/Xenon-4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Xenon-4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T06:47:30.573744](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Xenon-4/blob/main/results_2024-02-04T06-47-30.573744.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5969673597339272,\n \"acc_stderr\": 0.0332252879660183,\n \"acc_norm\": 0.6047382841391643,\n \"acc_norm_stderr\": 0.033940427206963365,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.613129800259979,\n \"mc2_stderr\": 0.016329535721420842\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496442,\n \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735569\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6468830910177256,\n \"acc_stderr\": 0.004769618829196511,\n \"acc_norm\": 0.8307110137422824,\n \"acc_norm_stderr\": 0.0037424055874098806\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.03784271932887468,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.03784271932887468\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601677,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601677\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\": 0.6,\n 
\"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232753,\n \"acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232753\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.014711684386139963,\n \"acc_norm\": 0.7841634738186463,\n \"acc_norm_stderr\": 0.014711684386139963\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n \"acc_stderr\": 0.01465578083749774,\n \"acc_norm\": 0.25921787709497207,\n \"acc_norm_stderr\": 0.01465578083749774\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186805,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186805\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n \"acc_stderr\": 0.012576779494860083,\n \"acc_norm\": 0.4132985658409387,\n \"acc_norm_stderr\": 0.012576779494860083\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.613129800259979,\n \"mc2_stderr\": 0.016329535721420842\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838232\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20697498104624715,\n \"acc_stderr\": 0.011159498164891772\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Xenon-4", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-47-30.573744.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-47-30.573744.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-47-30.573744.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T06-47-30.573744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-47-30.573744.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T06-47-30.573744.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["**/details_harness|winogrande|5_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T06-47-30.573744.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T06_47_30.573744", "path": ["results_2024-02-04T06-47-30.573744.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T06-47-30.573744.parquet"]}]}]} | 2024-02-04T06:50:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Xenon-4
Dataset automatically created during the evaluation run of model Xenon1/Xenon-4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
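A minimal example, following the loading pattern used by the other leaderboard detail datasets in this collection (the `details_Xenon1__Xenon-4` repo name and the `harness_winogrande_5` config are inferred from this dataset's metadata and naming convention, not re-verified here):

```python
from datasets import load_dataset

# "train" always points to the latest results for this configuration.
data = load_dataset("open-llm-leaderboard/details_Xenon1__Xenon-4",
	"harness_winogrande_5",
	split="train")
```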
## Latest results
These are the latest results from run 2024-02-04T06:47:30.573744 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Xenon-4\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:47:30.573744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Xenon-4\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Xenon-4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T06:47:30.573744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d04f6d26bab031e2f2f10b2f3e45604d3283f934 | This dataset is based on https://www.kaggle.com/code/danofer/reddit-comments-scores-nlp/
The moderation dataset includes only 70 thousand rows:
35 thousand negative and 35 thousand positive comments | ifmain/text-moderation | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | 2024-02-04T06:52:31+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "pretty_name": "Text Moderation"} | 2024-02-04T07:05:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-English #region-us
| This dataset is based on URL
The moderation dataset includes only 70 thousand rows:
35 thousand negative and 35 thousand positive comments | [] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #region-us \n"
] |
9581b126461d95f8c0e923e2f43d0a8daeec00b3 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This parallel corpus dataset contains about 21k rows of parallel English and Spanish texts obtained by crawling different websites. It has been filtered strictly.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This is a parallel corpus of bilingual texts crawled from multilingual websites, which contains 21,005 TUs. A strict validation process has been followed, which resulted in discarding:
 - TUs from crawled websites that do not comply with the PSI directive,
 - TUs with more than 99% misspelled tokens,
 - TUs identified during the manual validation process, and all TUs from websites whose error rate in the sample extracted for manual validation is strictly above the following thresholds: 50% of TUs with language identification errors, 50% of TUs with alignment errors, 50% of TUs with tokenization errors, 20% of TUs identified as machine-translated content, 50% of TUs with translation errors. An illustrative sketch of this threshold rule is shown right after this list.
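To make the per-website threshold rule concrete, here is a small illustrative sketch. It is not part of the original validation tooling, and the error-type keys are hypothetical names chosen for readability:

```python
# Hypothetical sketch of the per-website discard rule described above.
# A website's TUs are all discarded when any error rate measured on its
# manually validated sample is strictly above the corresponding threshold.
THRESHOLDS = {
    "language_identification": 0.50,
    "alignment": 0.50,
    "tokenization": 0.50,
    "machine_translated": 0.20,
    "translation": 0.50,
}

def website_is_discarded(sample_error_rates: dict) -> bool:
    """sample_error_rates maps an error type to the fraction of TUs in the
    manual validation sample showing that error, e.g. {"alignment": 0.6}."""
    return any(
        sample_error_rates.get(error_type, 0.0) > limit
        for error_type, limit in THRESHOLDS.items()
    )
```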
- **Period of crawling:** 15/11/2016 - 23/01/2017 (DD/MM/YYYY).
- **Curated by:** Directorate-General for Communications Networks, Content and Technology.
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** English & Spanish
- **License:** cc-by-4.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** http://data.europa.eu/88u/dataset/elrc_339
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
This dataset is perfect for training Machine Translation algorithms.
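For example, a minimal loading sketch with the `datasets` library (the `train` split name and the shape of each record are assumptions, not confirmed by this card):

```python
from datasets import load_dataset

# Assumed split name; inspect the repository to confirm the actual configuration.
corpus = load_dataset("Thermostatic/parallel_corpus_webcrawl_english_spanish_1",
                      split="train")
print(corpus[0])  # one English/Spanish translation unit (TU)
```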
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Thermostatic/parallel_corpus_webcrawl_english_spanish_1 | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:en",
"language:es",
"license:cc-by-4.0",
"English",
"Spanish",
"Parallel corpus",
"region:us"
] | 2024-02-04T06:54:58+00:00 | {"language": ["en", "es"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["translation"], "tags": ["English", "Spanish", "Parallel corpus"]} | 2024-02-04T07:20:58+00:00 | [] | [
"en",
"es"
] | TAGS
#task_categories-translation #size_categories-10K<n<100K #language-English #language-Spanish #license-cc-by-4.0 #English #Spanish #Parallel corpus #region-us
|
# Dataset Card for Dataset Name
This parallel corpus dataset contains about 21k rows of parallel English and Spanish texts obtained by crawling different websites. It has been filtered strictly.
## Dataset Details
### Dataset Description
This is a parallel corpus of bilingual texts crawled from multilingual websites, which contains 21,005 TUs. A strict validation process has been followed, which resulted in discarding:
 - TUs from crawled websites that do not comply with the PSI directive,
 - TUs with more than 99% misspelled tokens,
 - TUs identified during the manual validation process, and all TUs from websites whose error rate in the sample extracted for manual validation is strictly above the following thresholds: 50% of TUs with language identification errors, 50% of TUs with alignment errors, 50% of TUs with tokenization errors, 20% of TUs identified as machine-translated content, 50% of TUs with translation errors.
- Period of crawling: 15/11/2016 - 23/01/2017 (DD/MM/YYYY).
- Curated by: Directorate-General for Communications Networks, Content and Technology.
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP): English & Spanish
- License: cc-by-4.0
### Dataset Sources [optional]
- Repository: URL
- Paper [optional]:
- Demo [optional]:
## Uses
This dataset is perfect for training Machine Translation algorithms.
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis parallel corpus dataset contains about 21k rows of parallel English and Spanish texts obtained by crawling different websites. It has been filtered strictly.",
"## Dataset Details",
"### Dataset Description\n\n\n\nThis is a parallel corpus of bilingual texts crawled from multilingual websites, which contains 21, 005 TUs. A strict validation process has been followed, which resulted in discarding:\n\n - TUs from crawled websites that do not comply to the PSI directive,\n - TUs with more than 99% of mispelled tokens,\n - TUs identified during the manual validation process and all the TUs from websites which error rate in the sample extracted for manual validation is strictly above the following thresholds: 50% of TUs with language identification errors, 50% of TUs with alignment errors, 50% of TUs with tokenization errors, 20% of TUs identified as machine translated content, 50% of TUs with translation errors.\n\n- Period of crawling: 15/11/2016 - 23/01/2017 (DD/MM/YY).\n- Curated by: Directorate-General for Communications Networks, Content and Technology.\n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): English & Spanish\n- License: cc-by-4.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\n\n\nThis dataset is perfect for training Machine Translation algorithms.",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-English #language-Spanish #license-cc-by-4.0 #English #Spanish #Parallel corpus #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis parallel corpus dataset contains about 21k rows of parallel English and Spanish texts obtained by crawling different websites. It has been filtered strictly.",
"## Dataset Details",
"### Dataset Description\n\n\n\nThis is a parallel corpus of bilingual texts crawled from multilingual websites, which contains 21, 005 TUs. A strict validation process has been followed, which resulted in discarding:\n\n - TUs from crawled websites that do not comply to the PSI directive,\n - TUs with more than 99% of mispelled tokens,\n - TUs identified during the manual validation process and all the TUs from websites which error rate in the sample extracted for manual validation is strictly above the following thresholds: 50% of TUs with language identification errors, 50% of TUs with alignment errors, 50% of TUs with tokenization errors, 20% of TUs identified as machine translated content, 50% of TUs with translation errors.\n\n- Period of crawling: 15/11/2016 - 23/01/2017 (DD/MM/YY).\n- Curated by: Directorate-General for Communications Networks, Content and Technology.\n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): English & Spanish\n- License: cc-by-4.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\n\n\nThis dataset is perfect for training Machine Translation algorithms.",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9b398e354ee43f304407aa9da41664a42bafd8fc | created a total of 5 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.5644362568855286 mean: 3.9444944858551025
jlbaker361/ddpo-stability-dcgan-e5 std: 0.20163671672344208 mean: 3.857413673400879
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.18634916841983795 mean: 3.7780341148376464
jlbaker361/ddpo-stability-e5 std: 0.38460996747016907 mean: 4.132036781311035 | jlbaker361/stability-ddpo-evaluation-scale-0.5 | [
"region:us"
] | 2024-02-04T07:20:34+00:00 | {} | 2024-02-04T07:20:38+00:00 | [] | [] | TAGS
#region-us
| created a total of 5 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.5644362568855286 mean: 3.9444944858551025
jlbaker361/ddpo-stability-dcgan-e5 std: 0.20163671672344208 mean: 3.857413673400879
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.18634916841983795 mean: 3.7780341148376464
jlbaker361/ddpo-stability-e5 std: 0.38460996747016907 mean: 4.132036781311035 | [] | [
"TAGS\n#region-us \n"
] |
f5ad97a173c44f151aa7afbf489624eb2bc8e216 |
# Dataset Card for Evaluation run of SeaLLMs/SeaLLM-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SeaLLMs/SeaLLM-7B-v2](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T07:21:52.020377](https://huggingface.co/datasets/open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2/blob/main/results_2024-02-04T07-21-52.020377.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6219975370253238,
"acc_stderr": 0.03255711097476003,
"acc_norm": 0.6223954657019166,
"acc_norm_stderr": 0.033228262382024386,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276778,
"mc2": 0.5110807878391984,
"mc2_stderr": 0.014899343882305238
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6176060545708026,
"acc_stderr": 0.004849788423944365,
"acc_norm": 0.8232423819956184,
"acc_norm_stderr": 0.003806838448161734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895507,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099867,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976706,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976706
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734806,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.019412539242032168,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.019412539242032168
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276778,
"mc2": 0.5110807878391984,
"mc2_stderr": 0.014899343882305238
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.01143045004588158
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
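If you would rather work with these numbers programmatically than copy them out of the card, the aggregated metrics can be pulled from the `results` configuration of this dataset. The snippet below is a minimal sketch: it assumes the `datasets` library is installed and uses the `results` config and `latest` split that this card's tooling publishes.

```python
from datasets import load_dataset

# Aggregated metrics for the whole run live in the "results" config;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2",
    "results",
    split="latest",
)

# One row per run; fields mirror the JSON above (acc, acc_norm, mc1, mc2, ...).
print(results[0])
```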
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
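Until this section is filled in, the per-task layout can be inspected directly from the Hub. The sketch below uses the standard `get_dataset_config_names` helper from the `datasets` library and assumes network access; per the summary above, it should report one configuration per evaluated task plus the aggregated `results` config.

```python
from datasets import get_dataset_config_names

# Enumerate the evaluation configurations published for this run.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2"
)
print(len(configs), "configurations, e.g.:", configs[:5])
```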
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-04T07:24:12+00:00 | {"pretty_name": "Evaluation run of SeaLLMs/SeaLLM-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [SeaLLMs/SeaLLM-7B-v2](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T07:21:52.020377](https://huggingface.co/datasets/open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2/blob/main/results_2024-02-04T07-21-52.020377.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6219975370253238,\n \"acc_stderr\": 0.03255711097476003,\n \"acc_norm\": 0.6223954657019166,\n \"acc_norm_stderr\": 0.033228262382024386,\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.016629087514276778,\n \"mc2\": 0.5110807878391984,\n \"mc2_stderr\": 0.014899343882305238\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6176060545708026,\n \"acc_stderr\": 0.004849788423944365,\n \"acc_norm\": 0.8232423819956184,\n \"acc_norm_stderr\": 0.003806838448161734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895507,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099867,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099867\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8045977011494253,\n \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n \"acc_stderr\": 0.015318257745976706,\n \"acc_norm\": 0.2994413407821229,\n \"acc_norm_stderr\": 0.015318257745976706\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n \"acc_stderr\": 0.012716941720734806,\n \"acc_norm\": 0.45436766623207303,\n \"acc_norm_stderr\": 0.012716941720734806\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.019412539242032168,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.019412539242032168\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.016629087514276778,\n \"mc2\": 0.5110807878391984,\n \"mc2_stderr\": 0.014899343882305238\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.01143045004588158\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \"acc_stderr\": 
0.012740305717376268\n }\n}\n```", "repo_url": "https://huggingface.co/SeaLLMs/SeaLLM-7B-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-21-52.020377.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-21-52.020377.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-21-52.020377.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-21-52.020377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-21-52.020377.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T07_21_52.020377", "path": ["**/details_harness|winogrande|5_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T07-21-52.020377.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T07_21_52.020377", "path": ["results_2024-02-04T07-21-52.020377.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T07-21-52.020377.parquet"]}]}]} | 2024-02-04T07:24:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SeaLLMs/SeaLLM-7B-v2
Dataset automatically created during the evaluation run of model SeaLLMs/SeaLLM-7B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
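The code block itself was not preserved in this record, so the following is a minimal sketch, assuming the repository id follows the leaderboard's `details_<org>__<model>` naming pattern used elsewhere in this dump:

```python
from datasets import load_dataset

# Assumed repo id, following the details_<org>__<model> naming pattern
data = load_dataset(
    "open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2",
    "harness_winogrande_5",
    split="train",
)
```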
## Latest results
These are the latest results from run 2024-02-04T07:21:52.020377 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
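The per-run results JSON was likewise stripped from this record. Since the card states that the aggregated metrics live in the "results" configuration and that a "latest" split always tracks the most recent run, a small sketch for inspecting them (repo id assumed as above):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" tracks the newest run
results = load_dataset(
    "open-llm-leaderboard/details_SeaLLMs__SeaLLM-7B-v2",  # assumed repo id
    "results",
    split="latest",
)
print(results[0])
```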
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SeaLLMs/SeaLLM-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model SeaLLMs/SeaLLM-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:21:52.020377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SeaLLMs/SeaLLM-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model SeaLLMs/SeaLLM-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:21:52.020377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7fb2ccb447e90b9ea2674abc7858ab981ceccdd2 |
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO",
"harness_winogrande_5",
split="train")
```
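Since there is one configuration per evaluated task, it can be handy to enumerate them programmatically; a small sketch using the standard `datasets` helper:

```python
from datasets import get_dataset_config_names

# Lists the 63 per-task configurations of this details repo
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO"
)
print(len(configs), configs[:5])
```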
## Latest results
These are the [latest results from run 2024-02-04T07:27:05.423264](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO/blob/main/results_2024-02-04T07-27-05.423264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.639538124544364,
"acc_stderr": 0.032278452644941634,
"acc_norm": 0.6420765688344944,
"acc_norm_stderr": 0.0329201960227087,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5290826380707278,
"mc2_stderr": 0.015286081575472645
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620453
},
"harness|hellaswag|10": {
"acc": 0.6558454491137223,
"acc_stderr": 0.004741208229092874,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.003599758043546812
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.0133878957315436,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.0133878957315436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.015521923933523635,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.015521923933523635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083131,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5290826380707278,
"mc2_stderr": 0.015286081575472645
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.579226686884003,
"acc_stderr": 0.013598489497182838
}
}
```
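As a quick illustration of how the per-task entries above can be aggregated, here is a sketch that averages the `hendrycksTest` (MMLU) sub-task accuracies; for brevity only two of the 57 entries shown above are inlined, and the variable name `raw` is illustrative:

```python
import json

# Two entries copied from the results above; the full dict has all 57 sub-tasks.
raw = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074}
}"""
results = json.loads(raw)

# Average the accuracies of all MMLU (hendrycksTest) sub-tasks
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```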
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO | [
"region:us"
] | 2024-02-04T07:29:27+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T07:27:05.423264](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO/blob/main/results_2024-02-04T07-27-05.423264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.639538124544364,\n \"acc_stderr\": 0.032278452644941634,\n \"acc_norm\": 0.6420765688344944,\n \"acc_norm_stderr\": 0.0329201960227087,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5290826380707278,\n \"mc2_stderr\": 0.015286081575472645\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6558454491137223,\n \"acc_stderr\": 0.004741208229092874,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.003599758043546812\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 
0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n 
\"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.0133878957315436,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.0133878957315436\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n \"acc_stderr\": 0.015521923933523635,\n \"acc_norm\": 0.3139664804469274,\n \"acc_norm_stderr\": 0.015521923933523635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5290826380707278,\n \"mc2_stderr\": 0.015286081575472645\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.579226686884003,\n \"acc_stderr\": 0.013598489497182838\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-27-05.423264.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-27-05.423264.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-27-05.423264.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-27-05.423264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-27-05.423264.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["**/details_harness|winogrande|5_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T07-27-05.423264.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T07_27_05.423264", "path": ["results_2024-02-04T07-27-05.423264.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T07-27-05.423264.parquet"]}]}]} | 2024-02-04T07:29:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO
Dataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
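A minimal sketch (the original snippet was stripped from this card; the repository id below is an assumption that follows the leaderboard's standard `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repository id assumed from the standard leaderboard naming scheme:
# open-llm-leaderboard/details_<org>__<model>
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO",
	"harness_winogrande_5",
	split="train")
```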
## Latest results
These are the latest results from run 2024-02-04T07:27:05.423264 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:27:05.423264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:27:05.423264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
889d33810d2a387084140eb5da7d31d53622467c | created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.3504577875137329 mean: 3.8840092182159425 inception_mean: 3.00921630859375 inception_src: 0.5300262570381165
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3601852357387543 mean: 3.8566782760620115 inception_mean: 4.235417366027832 inception_src: 0.29068735241889954
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.4087407886981964 mean: 3.860313024520874 inception_mean: 4.847194194793701 inception_src: 0.899190366268158
jlbaker361/ddpo-stability-e5 std: 0.34105899930000305 mean: 3.891744394302368 inception_mean: 5.334989547729492 inception_src: 0.6441727876663208 | jlbaker361/stability-ddpo-evaluation-scale-0.25 | [
"region:us"
] | 2024-02-04T07:32:56+00:00 | {} | 2024-02-07T13:38:10+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.3504577875137329 mean: 3.8840092182159425 inception_mean: 3.00921630859375 inception_src: 0.5300262570381165
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3601852357387543 mean: 3.8566782760620115 inception_mean: 4.235417366027832 inception_src: 0.29068735241889954
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.4087407886981964 mean: 3.860313024520874 inception_mean: 4.847194194793701 inception_src: 0.899190366268158
jlbaker361/ddpo-stability-e5 std: 0.34105899930000305 mean: 3.891744394302368 inception_mean: 5.334989547729492 inception_src: 0.6441727876663208 | [] | [
"TAGS\n#region-us \n"
] |
182b816e0503f436a9c495104ac4b099b8313562 | created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.4188942611217499 mean: 3.71910099029541 inception_mean: 3.1958298683166504 inception_src: 0.3549079895019531
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3745228052139282 mean: 3.838933000564575 inception_mean: 4.484759330749512 inception_src: 0.6912834644317627
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.40060076117515564 mean: 3.895979642868042 inception_mean: 5.08394718170166 inception_src: 0.8122329711914062
jlbaker361/ddpo-stability-e5 std: 0.3600755035877228 mean: 3.9948656940460205 inception_mean: 5.5453715324401855 inception_src: 0.9075482487678528 | jlbaker361/stability-ddpo-evaluation-scale-0.75 | [
"region:us"
] | 2024-02-04T07:45:19+00:00 | {} | 2024-02-06T20:43:01+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.4188942611217499 mean: 3.71910099029541 inception_mean: 3.1958298683166504 inception_src: 0.3549079895019531
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3745228052139282 mean: 3.838933000564575 inception_mean: 4.484759330749512 inception_src: 0.6912834644317627
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.40060076117515564 mean: 3.895979642868042 inception_mean: 5.08394718170166 inception_src: 0.8122329711914062
jlbaker361/ddpo-stability-e5 std: 0.3600755035877228 mean: 3.9948656940460205 inception_mean: 5.5453715324401855 inception_src: 0.9075482487678528 | [] | [
"TAGS\n#region-us \n"
] |
c0419caf1ce4a4193ed7864fe21083175e475faf |
# Dataset Card for Evaluation run of vikash06/doctorLLM10k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/doctorLLM10k](https://huggingface.co/vikash06/doctorLLM10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM10k",
"harness_winogrande_5",
split="train")
```
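
As a follow-up, the aggregated metrics live in the separate "results" configuration mentioned above; a minimal sketch of loading them, with the config and split names taken from this card's description:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM10k",
	"results",
	split="latest")
```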
## Latest results
These are the [latest results from run 2024-02-04T07:43:07.963354](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM10k/blob/main/results_2024-02-04T07-43-07.963354.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44666273408360996,
"acc_stderr": 0.034414012934586825,
"acc_norm": 0.4518038323283762,
"acc_norm_stderr": 0.03520583214811468,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.44761845682432344,
"mc2_stderr": 0.015236583455591498
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536597,
"acc_norm": 0.5494880546075085,
"acc_norm_stderr": 0.014539646098471625
},
"harness|hellaswag|10": {
"acc": 0.6201951802429795,
"acc_stderr": 0.0048434625459435,
"acc_norm": 0.7994423421629158,
"acc_norm_stderr": 0.003995992960088763
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.040260970832965585,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.040260970832965585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47419354838709676,
"acc_stderr": 0.028406095057653315,
"acc_norm": 0.47419354838709676,
"acc_norm_stderr": 0.028406095057653315
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431856,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431856
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6217616580310881,
"acc_stderr": 0.034998072761933376,
"acc_norm": 0.6217616580310881,
"acc_norm_stderr": 0.034998072761933376
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.0259288761327661,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.0259288761327661
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6018348623853211,
"acc_stderr": 0.020987989422654264,
"acc_norm": 0.6018348623853211,
"acc_norm_stderr": 0.020987989422654264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.035077938347913236,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.035077938347913236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6155810983397191,
"acc_stderr": 0.01739568874281962,
"acc_norm": 0.6155810983397191,
"acc_norm_stderr": 0.01739568874281962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364562,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167965,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422697,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3546284224250326,
"acc_stderr": 0.012218576439090167,
"acc_norm": 0.3546284224250326,
"acc_norm_stderr": 0.012218576439090167
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4199346405228758,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.4199346405228758,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.03546976959393162,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.03546976959393162
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.44761845682432344,
"mc2_stderr": 0.015236583455591498
},
"harness|winogrande|5": {
"acc": 0.7000789265982637,
"acc_stderr": 0.01287834752663607
},
"harness|gsm8k|5": {
"acc": 0.10159211523881728,
"acc_stderr": 0.008321642868474812
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vikash06__doctorLLM10k | [
"region:us"
] | 2024-02-04T07:45:34+00:00 | {"pretty_name": "Evaluation run of vikash06/doctorLLM10k", "dataset_summary": "Dataset automatically created during the evaluation run of model [vikash06/doctorLLM10k](https://huggingface.co/vikash06/doctorLLM10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__doctorLLM10k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T07:43:07.963354](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM10k/blob/main/results_2024-02-04T07-43-07.963354.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44666273408360996,\n \"acc_stderr\": 0.034414012934586825,\n \"acc_norm\": 0.4518038323283762,\n \"acc_norm_stderr\": 0.03520583214811468,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.44761845682432344,\n \"mc2_stderr\": 0.015236583455591498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536597,\n \"acc_norm\": 0.5494880546075085,\n \"acc_norm_stderr\": 0.014539646098471625\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6201951802429795,\n \"acc_stderr\": 0.0048434625459435,\n \"acc_norm\": 0.7994423421629158,\n \"acc_norm_stderr\": 0.003995992960088763\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.040260970832965585,\n \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.040260970832965585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.030635627957961823,\n \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.030635627957961823\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n 
\"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47419354838709676,\n \"acc_stderr\": 0.028406095057653315,\n \"acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.028406095057653315\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431856,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431856\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.034998072761933376,\n \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.034998072761933376\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.0259288761327661,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.0259288761327661\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6018348623853211,\n \"acc_stderr\": 0.020987989422654264,\n \"acc_norm\": 0.6018348623853211,\n \"acc_norm_stderr\": 0.020987989422654264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.035077938347913236,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.035077938347913236\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.039194155450484096,\n \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.039194155450484096\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128921,\n \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128921\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.6155810983397191,\n \"acc_stderr\": 0.01739568874281962,\n \"acc_norm\": 0.6155810983397191,\n \"acc_norm_stderr\": 0.01739568874281962\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n \"acc_stderr\": 0.014696599650364562,\n \"acc_norm\": 0.26145251396648045,\n \"acc_norm_stderr\": 0.014696599650364562\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167965,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167965\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.02832032583010591,\n \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.02832032583010591\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422697,\n \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422697\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3546284224250326,\n \"acc_stderr\": 0.012218576439090167,\n \"acc_norm\": 0.3546284224250326,\n \"acc_norm_stderr\": 0.012218576439090167\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4199346405228758,\n \"acc_stderr\": 0.019966811178256483,\n \"acc_norm\": 0.4199346405228758,\n \"acc_norm_stderr\": 0.019966811178256483\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.5970149253731343,\n \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393162,\n \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393162\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.44761845682432344,\n \"mc2_stderr\": 0.015236583455591498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7000789265982637,\n \"acc_stderr\": 0.01287834752663607\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10159211523881728,\n \"acc_stderr\": 0.008321642868474812\n }\n}\n```", 
"repo_url": "https://huggingface.co/vikash06/doctorLLM10k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-43-07.963354.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-43-07.963354.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-43-07.963354.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T07-43-07.963354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-43-07.963354.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T07_43_07.963354", "path": ["**/details_harness|winogrande|5_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T07-43-07.963354.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T07_43_07.963354", "path": ["results_2024-02-04T07-43-07.963354.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T07-43-07.963354.parquet"]}]}]} | 2024-02-04T07:45:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vikash06/doctorLLM10k
Dataset automatically created during the evaluation run of model vikash06/doctorLLM10k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
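A sketch of the standard loading call (the repo id `open-llm-leaderboard/details_vikash06__doctorLLM10k` is inferred from the leaderboard's naming convention, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id inferred from the Open LLM Leaderboard convention
# (details_<org>__<model>); "harness_winogrande_5" is one of the
# configurations listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM10k",
	"harness_winogrande_5",
	split="train")
```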
## Latest results
These are the latest results from run 2024-02-04T07:43:07.963354 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vikash06/doctorLLM10k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM10k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:43:07.963354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vikash06/doctorLLM10k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM10k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T07:43:07.963354 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d12e0c0a413fdbe793467cb02a89f8e6d415c459 | created a total of 50 images
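The per-model lines below report summary statistics over these 50 images; a minimal sketch of how such `mean`/`std` figures are computed (the score values here are placeholders, not the real data):

```python
import numpy as np

# Placeholder per-image scores; the actual 50 values are not listed here.
scores = np.array([3.91, 3.72, 4.05, 3.80])
print("mean:", scores.mean())  # reported as "mean" below
print("std:", scores.std())    # reported as "std" below
```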
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.3620029091835022 mean: 3.8459116649627685 inception_mean: 3.0659842491149902 inception_src: 0.6550269722938538
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3346726894378662 mean: 3.8821122550964358 inception_mean: 4.262051105499268 inception_src: 1.106516718864441
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3248203992843628 mean: 3.887494239807129 inception_mean: 5.007357120513916 inception_src: 0.8150985240936279
jlbaker361/ddpo-stability-e5 std: 0.36267760396003723 mean: 3.8799888992309572 inception_mean: 5.574629783630371 inception_src: 0.9062372446060181 | jlbaker361/stability-ddpo-evaluation-scale-1.0 | [
"region:us"
] | 2024-02-04T07:45:50+00:00 | {} | 2024-02-06T17:56:39+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.3620029091835022 mean: 3.8459116649627685 inception_mean: 3.0659842491149902 inception_src: 0.6550269722938538
jlbaker361/ddpo-stability-dcgan-e5 std: 0.3346726894378662 mean: 3.8821122550964358 inception_mean: 4.262051105499268 inception_src: 1.106516718864441
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3248203992843628 mean: 3.887494239807129 inception_mean: 5.007357120513916 inception_src: 0.8150985240936279
jlbaker361/ddpo-stability-e5 std: 0.36267760396003723 mean: 3.8799888992309572 inception_mean: 5.574629783630371 inception_src: 0.9062372446060181 | [] | [
"TAGS\n#region-us \n"
] |
74c0d1782a556c5deb2f43da461b328532a5c26d | created a total of 50 images
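To inspect the generated images behind the figures below, the repo can presumably be loaded like any Hub dataset (a sketch; this assumes the images were pushed as a standard split, which the card does not confirm):

```python
from datasets import load_dataset

# Assumes the 50 generated images live in this repo as a standard split.
ds = load_dataset("jlbaker361/stability-ddpo-evaluation-scale-0.9")
print(ds)
```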
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.35849377512931824 mean: 3.823330249786377 inception_mean: 2.998213291168213 inception_src: 0.24159207940101624
jlbaker361/ddpo-stability-dcgan-e5 std: 0.32244443893432617 mean: 3.828410129547119 inception_mean: 4.1628241539001465 inception_src: 0.3664723336696625
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3253556191921234 mean: 3.8177423000335695 inception_mean: 5.146191596984863 inception_src: 0.6615503430366516
jlbaker361/ddpo-stability-e5 std: 0.31581810116767883 mean: 3.988733243942261 inception_mean: 5.774373531341553 inception_src: 0.9121077656745911 | jlbaker361/stability-ddpo-evaluation-scale-0.9 | [
"region:us"
] | 2024-02-04T07:49:05+00:00 | {} | 2024-02-06T20:48:40+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability-dcgan-e5-CONDITIONAL std: 0.35849377512931824 mean: 3.823330249786377 inception_mean: 2.998213291168213 inception_src: 0.24159207940101624
jlbaker361/ddpo-stability-dcgan-e5 std: 0.32244443893432617 mean: 3.828410129547119 inception_mean: 4.1628241539001465 inception_src: 0.3664723336696625
jlbaker361/ddpo-stability-e5-CONDITIONAL std: 0.3253556191921234 mean: 3.8177423000335695 inception_mean: 5.146191596984863 inception_src: 0.6615503430366516
jlbaker361/ddpo-stability-e5 std: 0.31581810116767883 mean: 3.988733243942261 inception_mean: 5.774373531341553 inception_src: 0.9121077656745911 | [] | [
"TAGS\n#region-us \n"
] |
28e517098ce750fde4b3bc9941ecaa2832f6971d |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
A massive parallel corpus of English-Spanish sentence pairs extracted from the proceedings of the European Parliament. It doesn't have a specified license, but there doesn't seem to be any copyrighted material in the corpus.
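A minimal sketch of reading the corpus as aligned sentence pairs (the file names assume the standard statmt.org v7 release, `europarl-v7.es-en.en` / `europarl-v7.es-en.es`, and are not confirmed by this card):

```python
# Line i of one file is the translation of line i of the other.
with open("europarl-v7.es-en.en", encoding="utf-8") as en_f, \
     open("europarl-v7.es-en.es", encoding="utf-8") as es_f:
    pairs = [(en.strip(), es.strip()) for en, es in zip(en_f, es_f)]

print(len(pairs), pairs[0])
```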
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Philipp Koehn
- **Funded by [optional]:** In part funded by the European Commission (7th Framework Programme).
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** English & Spanish
- **License:** Not specified.
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://www.statmt.org/europarl/index.html
- **Paper [optional]:** https://aclanthology.org/2005.mtsummit-papers.11.pdf
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Thermostatic/parallel_corpus_europarl_english_spanish | [
"task_categories:translation",
"size_categories:1M<n<10M",
"language:en",
"language:es",
"English",
"Spanish",
"Parallel Corpus",
"region:us"
] | 2024-02-04T08:04:07+00:00 | {"language": ["en", "es"], "size_categories": ["1M<n<10M"], "task_categories": ["translation"], "pretty_name": "Europarl", "tags": ["English", "Spanish", "Parallel Corpus"]} | 2024-02-04T08:25:35+00:00 | [] | [
"en",
"es"
] | TAGS
#task_categories-translation #size_categories-1M<n<10M #language-English #language-Spanish #English #Spanish #Parallel Corpus #region-us
|
# Dataset Card for Dataset Name
A massive parallel corpus of English-Spanish sentence pairs extracted from the proceedings of the European Parliament. It doesn't have a specified license, but there doesn't seem to be any copyrighted material in the corpus.
## Dataset Details
### Dataset Description
- Curated by: Philipp Koehn
- Funded by [optional]: In part funded by the European Commission (7th Framework Programme).
- Shared by [optional]:
- Language(s) (NLP): English & Spanish
- License: Not specified.
### Dataset Sources [optional]
- Repository: URL
- Paper [optional]: URL
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nA massive parallel corpus of English-Spanish sentence pairs extracted from the proceedings of the European Parliament. It doesn't have a specified license, but there doesn't seem to be any copyrighted material in the corpus.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Philipp Koehn\n- Funded by [optional]: In part funded by the European Commission (7th Framework Programme).\n- Shared by [optional]: \n- Language(s) (NLP): English & Spanish\n- License: Not specified.",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: URL\n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-translation #size_categories-1M<n<10M #language-English #language-Spanish #English #Spanish #Parallel Corpus #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nA massive parallel corpus of English-Spanish sentence pairs extracted from the proceedings of the European Parliament. It doesn't have a specified license, but there doesn't seem to be any copyrighted material in the corpus.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Philipp Koehn\n- Funded by [optional]: In part funded by the European Commission (7th Framework Programme).\n- Shared by [optional]: \n- Language(s) (NLP): English & Spanish\n- License: Not specified.",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: URL\n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
56ea94355ac31a32f2b062b25229ddc1405c714b |
# Databricks-dolly
This is a cleansed version of [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/databricks-dolly-15k", split="train")
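# Each of the 15,011 rows exposes a single "prompt" string column
# (per this repo's dataset metadata); for example:
print(dataset[0]["prompt"])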
``` | Sharathhebbar24/databricks-dolly-15k | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-04T08:12:27+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "dolly", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12664945, "num_examples": 15011}], "download_size": 7368629, "dataset_size": 12664945}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T14:03:06+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
# Databricks-dolly
This is a cleansed version of databricks/databricks-dolly-15k
## Usage
| [
"# Databricks-dolly\n\nThis is a cleansed version of databricks/databricks-dolly-15k",
"## Usage"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"# Databricks-dolly\n\nThis is a cleansed version of databricks/databricks-dolly-15k",
"## Usage"
] |
5212fda15e1eb7b11b38e0eaf83899a9c497f454 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | asier86/certifs_tfdm | [
"license:unknown",
"region:us"
] | 2024-02-04T08:16:55+00:00 | {"license": "unknown"} | 2024-02-04T08:28:32+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-unknown #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bdc9dc949cceb7fb5f8fc35d6e7dd25980a34b8b |
# Dataset Card for Evaluation run of BarraHome/zephyr-dpo-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/zephyr-dpo-v2](https://huggingface.co/BarraHome/zephyr-dpo-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2",
"harness_winogrande_5",
split="train")
```
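The aggregated scores live in the "results" configuration mentioned above and can be loaded the same way (a sketch; the "latest" split name follows the convention used across these evaluation repos):

```python
from datasets import load_dataset

# "latest" always points to the most recent run; each run is also
# available under its own timestamped split.
results = load_dataset("open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2",
	"results",
	split="latest")
```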
## Latest results
These are the [latest results from run 2024-02-04T08:58:25.311637](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2/blob/main/results_2024-02-04T08-58-25.311637.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5842601909932569,
"acc_stderr": 0.033311942698808106,
"acc_norm": 0.5900864037887772,
"acc_norm_stderr": 0.03399934210645472,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5616226683920723,
"mc2_stderr": 0.015980395758532336
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633827,
"acc_norm": 0.5784982935153583,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.6367257518422625,
"acc_stderr": 0.004799599840397376,
"acc_norm": 0.8272256522605059,
"acc_norm_stderr": 0.0037727944471851503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803638,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803638
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408832,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408832
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.033769221512523366,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.033769221512523366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.046533331469736455,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.046533331469736455
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503949,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503949
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937153,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277892,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277892
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.014796502622562557,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.014796502622562557
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885996,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885996
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5616226683920723,
"mc2_stderr": 0.015980395758532336
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759989
},
"harness|gsm8k|5": {
"acc": 0.3025018953752843,
"acc_stderr": 0.012652544133186141
}
}
```
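The aggregated block above is stored as a timestamped JSON file in the repo (the filename is taken from the link above). If you prefer to fetch it directly instead of going through the `datasets` loader, a minimal sketch with `huggingface_hub` is shown below; note that the exact key layout inside the file is an assumption to verify, so the sketch only inspects the top-level keys:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file referenced above straight from the repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2",
    filename="results_2024-02-04T08-58-25.311637.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys; the per-task metrics block shown above is
# nested inside (the exact nesting can vary between harness versions).
print(list(results))
```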
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2 | [
"region:us"
] | 2024-02-04T09:00:14+00:00 | {"pretty_name": "Evaluation run of BarraHome/zephyr-dpo-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/zephyr-dpo-v2](https://huggingface.co/BarraHome/zephyr-dpo-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T08:58:25.311637](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2/blob/main/results_2024-02-04T08-58-25.311637.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5842601909932569,\n \"acc_stderr\": 0.033311942698808106,\n \"acc_norm\": 0.5900864037887772,\n \"acc_norm_stderr\": 0.03399934210645472,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5616226683920723,\n \"mc2_stderr\": 0.015980395758532336\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633827,\n \"acc_norm\": 0.5784982935153583,\n \"acc_norm_stderr\": 0.014430197069326023\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6367257518422625,\n \"acc_stderr\": 0.004799599840397376,\n \"acc_norm\": 0.8272256522605059,\n \"acc_norm_stderr\": 0.0037727944471851503\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n 
\"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764812,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764812\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803638,\n \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803638\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408832,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408832\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232756,\n \"acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232756\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.033769221512523366,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.033769221512523366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.046533331469736455,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.046533331469736455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503949,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503949\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n 
\"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277892,\n \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277892\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n \"acc_stderr\": 0.014796502622562557,\n \"acc_norm\": 0.2670391061452514,\n \"acc_norm_stderr\": 0.014796502622562557\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n \"acc_stderr\": 0.012620785155885996,\n \"acc_norm\": 0.423728813559322,\n \"acc_norm_stderr\": 0.012620785155885996\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5616226683920723,\n \"mc2_stderr\": 0.015980395758532336\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759989\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3025018953752843,\n \"acc_stderr\": 0.012652544133186141\n }\n}\n```", 
"repo_url": "https://huggingface.co/BarraHome/zephyr-dpo-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|arc:challenge|25_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|arc:challenge|25_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|gsm8k|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|gsm8k|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hellaswag|10_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hellaswag|10_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-57-54.838918.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T08-57-54.838918.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-58-25.311637.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-58-25.311637.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-58-25.311637.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T08-58-25.311637.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-57-54.838918.parquet"]}, 
{"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["**/details_harness|winogrande|5_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": ["**/details_harness|winogrande|5_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T08-58-25.311637.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T08_57_54.838918", "path": ["results_2024-02-04T08-57-54.838918.parquet"]}, {"split": "2024_02_04T08_58_25.311637", "path": 
["results_2024-02-04T08-58-25.311637.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T08-58-25.311637.parquet"]}]}]} | 2024-02-04T09:00:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarraHome/zephyr-dpo-v2
Dataset automatically created during the evaluation run of model BarraHome/zephyr-dpo-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
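The original snippet was stripped from this copy of the card; below is a minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model:
```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's naming convention for
# BarraHome/zephyr-dpo-v2; any of the 63 configurations can be requested.
data = load_dataset("open-llm-leaderboard/details_BarraHome__zephyr-dpo-v2",
                    "harness_winogrande_5",
                    split="train")
```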
## Latest results
These are the latest results from run 2024-02-04T08:58:25.311637 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarraHome/zephyr-dpo-v2\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/zephyr-dpo-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T08:58:25.311637(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarraHome/zephyr-dpo-v2\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/zephyr-dpo-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T08:58:25.311637(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6bfeac5a446ac0b02db6c37f5773791e76bd8cf0 | # Description
3,919 People Multi-pose Faces Data, 24 images and 9 videos per person. The collection environment includes indoor and outdoor scenes. This data can be used for face detection, face recognition and other tasks.
For more details, please visit: https://www.nexdata.ai/datasets/1199?source=Huggingface
# Specifications
## Data size
3,919 people, 24 images and 9 videos per person
## Race distribution
Asians
## Nationality distribution
114 people from Cambodia, 1,951 people from Indonesia, 34 people from Korea, 234 people from Mongolia, 1,107 people from Philippines, 479 people from Vietnam
## Gender distribution
2,046 males, 1,873 females
## Age distribution
1,338 people under 18 years old, 1,975 people aged from 18 to 45, 404 people aged from 46 to 60, 202 people over 60 years old
## Collecting environment
including indoor and outdoor scenes
## Data diversity
different face poses, nationalities, ages, light conditions, different scenes
## Device
cellphone
## Data format
the image data format is .jpeg, .jpg; the video data format is .mp4, .mov
## Accuracy
the accuracy of the labels for face pose, head pose, nationality, gender, collection environment and age is more than 97%
# Licensing Information
Commercial License | Nexdata/Multi-pose_Faces_Data | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T09:09:43+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T10:01:42+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| # Description
3,919 People Multi-pose Faces Data, 24 images and 9 videos per person. The collection environment includes indoor and outdoor scenes. This data can be used for face detection, face recognition and other tasks.
For more details, please visit: URL
# Specifications
## Data size
3,919 people, 24 images and 9 videos per person
## Race distribution
Asians
## Nationality distribution
114 people from Cambodia, 1,951 people from Indonesia, 34 people from Korea, 234 people from Mongolia, 1,107 people from Philippines, 479 people from Vietnam
## Gender distribution
2,046 males, 1,873 females
## Age distribution
1,338 people under 18 years old, 1,975 people aged from 18 to 45, 404 people aged from 46 to 60, 202 people over 60 years old
## Collecting environment
including indoor and outdoor scenes
## Data diversity
different face poses, nationalities, ages, light conditions, different scenes
## Device
cellphone
## Data format
the image data format is .jpeg, .jpg; the video data format is .mp4, .mov
## Accuracy
the accuracy of the labels for face pose, head pose, nationality, gender, collection environment and age is more than 97%
# Licensing Information
Commercial License | [
"# Description\n3,919 People Multi-pose Faces Data, 24 images and 9 videos per person. The collection environment includes indoor and outdoor scenes. This data can be used for face detection, face recognition and other tasks.\nFor more details, please visit: URL",
"# Specifications",
"## Data size\n3,919 people, 24 images and 9 videos per person",
"## Race distribution\nAsians",
"## Nationality distribution\n114 people from Cambodia, 1,951 people from Indonesia, 34 people from Korea, 234 people from Mongolia, 1,107 people from Philippines, 479 people from Vietnam",
"## Gender distribution\n2,046 males, 1,873 females",
"## Age distribution\n1,338 people under 18 years old, 1,975 people aged from 18 to 45, 404 people aged from 46 to 60,202 people over 60 years old",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Data diversity\ndifferent face poses, nationalities, ages, light conditions, different scenes",
"## Device\ncellphone",
"## Data format\nthe image data format is .jpeg, .jpg; the video data format is .mp4, .mov",
"## Accuracy\nthe accuracy of labels of face pose, head pose, nationality, gender, collection environment and age are more than 97%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"# Description\n3,919 People Multi-pose Faces Data, 24 images and 9 videos per person. The collection environment includes indoor and outdoor scenes. This data can be used for face detection, face recognition and other tasks.\nFor more details, please visit: URL",
"# Specifications",
"## Data size\n3,919 people, 24 images and 9 videos per person",
"## Race distribution\nAsians",
"## Nationality distribution\n114 people from Cambodia, 1,951 people from Indonesia, 34 people from Korea, 234 people from Mongolia, 1,107 people from Philippines, 479 people from Vietnam",
"## Gender distribution\n2,046 males, 1,873 females",
"## Age distribution\n1,338 people under 18 years old, 1,975 people aged from 18 to 45, 404 people aged from 46 to 60,202 people over 60 years old",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Data diversity\ndifferent face poses, nationalities, ages, light conditions, different scenes",
"## Device\ncellphone",
"## Data format\nthe image data format is .jpeg, .jpg; the video data format is .mp4, .mov",
"## Accuracy\nthe accuracy of labels of face pose, head pose, nationality, gender, collection environment and age are more than 97%",
"# Licensing Information\nCommercial License"
] |
22dd5522a2c646eab853a2999ee9904549af653a | # Description
Micro-expression video data of more than 2,000 people, including Asian, Black, Caucasian and Brown; age includes under 18, 18-45, 46-60, and over 60; collection environment includes indoor scenes and outdoor scenes; it can be used in various scenes such as face recognition and expression recognition.
For more details, please visit: https://www.nexdata.ai/datasets/1275?source=Huggingface
# Specifications
## Data size
57 types, 68,405 videos
## Race distribution
Asian, Black, Caucasian, Brown
## Gender distribution
male, female
## Age distribution
under 18 years old, 18~45 years old, 46~60 years old, over 60 years old
## Collecting environment
including indoor and outdoor scenes
## Collection diversity
57 micro-expressions, multiracial, multiple scenarios
## Collection device
cellphone
## Data format
the video data format is .mp4
## Collection content
collecting multiple micro-expression video data of different subjects
## Accuracy rate
based on the accuracy of the acquisition actions, the collection accuracy exceeds 97%; the accuracy of label annotation is over 97%
# Licensing Information
Commercial License
| Nexdata/57_Types_of_Micro-expression_Data | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T09:12:06+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T10:02:03+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| # Description
Micro-expression video data of more than 2,000 people, including Asian, Black, Caucasian and Brown; age includes under 18, 18-45, 46-60, and over 60; collection environment includes indoor scenes and outdoor scenes; it can be used in various scenes such as face recognition and expression recognition.
For more details, please visit: URL
# Specifications
## Data size
57 types, 68,405 videos
## Race distribution
Asian, Black, Caucasian, Brown
## Gender distribution
male, female
## Age distribution
under 18 years old, 18~45 years old, 46~60 years old, over 60 years old
## Collecting environment
including indoor and outdoor scenes
## Collection diversity
57 micro-expressions, multiracial, multiple scenarios
## Collection device
cellphone
## Data format
the video data format is .mp4
## Collection content
collecting multiple micro-expression video data of different subjects
## Accuracy rate
based on the accuracy of the acquisition actions, the collection accuracy exceeds 97%; the accuracy of label annotation is over 97%
# Licensing Information
Commercial License
| [
"# Description\nMicro-expression video data of more than 2,000 people, including Asian, Black, Caucasian and Brown; age includes under 18, 18-45, 46-60, and over 60; collection environment includes indoor scenes and outdoor scenes; it can be used in various scenes such as face recognition and expression recognition.\nFor more details, please visit: URL",
"# Specifications",
"## Data size\n57 types, 68,405 videos",
"## Race distribution\nAsian, Black, Caucasian, Brown",
"## Gender distribution\nmale , female",
"## Age distribution\nunder 18 years old, 18~45 years old, 46~60 years old, over 60 years old",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Collection diversity\n57 micro-expressions, multiracial, multiple scenarios",
"## Collection device\ncellphone",
"## Data format\nthe video data format is .mp4",
"## Collection content\ncollecting multiple micro-expression video data of different subjects",
"## Accuracy rate\naccording to the accuracy of the acquisition action, the accuracy exceeds 97%; the accuracy of label annotation is over 97%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"# Description\nMicro-expression video data of more than 2,000 people, including Asian, Black, Caucasian and Brown; age includes under 18, 18-45, 46-60, and over 60; collection environment includes indoor scenes and outdoor scenes; it can be used in various scenes such as face recognition and expression recognition.\nFor more details, please visit: URL",
"# Specifications",
"## Data size\n57 types, 68,405 videos",
"## Race distribution\nAsian, Black, Caucasian, Brown",
"## Gender distribution\nmale , female",
"## Age distribution\nunder 18 years old, 18~45 years old, 46~60 years old, over 60 years old",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Collection diversity\n57 micro-expressions, multiracial, multiple scenarios",
"## Collection device\ncellphone",
"## Data format\nthe video data format is .mp4",
"## Collection content\ncollecting multiple micro-expression video data of different subjects",
"## Accuracy rate\naccording to the accuracy of the acquisition action, the accuracy exceeds 97%; the accuracy of label annotation is over 97%",
"# Licensing Information\nCommercial License"
] |
d5e084263a28d0307998ce07f5b3494482428b25 |
### Description
This dataset is derived from an existing dataset made by AI4Bharat. We have used the [IndicXParaphrase](https://huggingface.co/datasets/ai4bharat/IndicXParaphrase) dataset of AI4Bharat to create this instruction-style dataset.
This was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
IndicXParaphrase is a multilingual, n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset (IndicXParaphrase) was made available under the cc-0 license.
### Template
The following templates were used for converting the original dataset:
```
#Template 1
prompt:
Write the following sentence using different words: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 2
prompt:
Rewrite the following sentence in different way: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 3
prompt:
Paraphrase the following sentence:: "{original_sentence}"
completion:
{paraphrased_sentence}
```
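As a usage sketch, the splits named in this repository's configuration (`mal`, `ben`, `guj`, `hin`, `mar`, `pan`) can be loaded individually with the `datasets` library; the exact field names are best checked against the loaded schema:
```python
from datasets import load_dataset

# Load the Malayalam split; the other splits are ben, guj, hin, mar and pan.
ds = load_dataset("el2e10/aya-paraphrase", split="mal")
print(ds.features)  # inspect the schema (field names may vary)
print(ds[0])        # one prompt/completion pair
```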
### Acknowledgement
Thanks to Jay Patel for providing the Gujarati translations, Amarjit for the Punjabi translations,
Yogesh Haribhau Kulkarni for the Marathi translations,
Ganesh Jagadeesan for the Hindi translations, and Tahmid Hossain for the Bengali translations of the above-mentioned English prompts. | el2e10/aya-paraphrase | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:ml",
"language:gu",
"language:mr",
"language:hi",
"language:pa",
"language:bn",
"license:cc",
"region:us"
] | 2024-02-04T09:22:28+00:00 | {"language": ["ml", "gu", "mr", "hi", "pa", "bn"], "license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "Aya Paraphrase", "configs": [{"config_name": "default", "data_files": [{"split": "mal", "path": "data/mal.parquet"}, {"split": "ben", "path": "data/ben.parquet"}, {"split": "guj", "path": "data/guj.parquet"}, {"split": "hin", "path": "data/hin.parquet"}, {"split": "mar", "path": "data/mar.parquet"}, {"split": "pan", "path": "data/pan.parquet"}]}]} | 2024-02-04T10:15:11+00:00 | [] | [
"ml",
"gu",
"mr",
"hi",
"pa",
"bn"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Malayalam #language-Gujarati #language-Marathi #language-Hindi #language-Panjabi #language-Bengali #license-cc #region-us
|
### Description
This dataset is derived from an existing dataset made by AI4Bharat. We have used the IndicXParaphrase dataset of AI4Bharat to create this instruction-style dataset.
This was created as part of the Aya Open Science Initiative from Cohere For AI.
IndicXParaphrase is a multilingual, n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset (IndicXParaphrase) was made available under the cc-0 license.
### Template
The following templates were used for converting the original dataset:
### Acknowledgement
Thanks to Jay Patel for providing the Gujarati translations, Amarjit for the Punjabi translations,
Yogesh Haribhau Kulkarni for the Marathi translations,
Ganesh Jagadeesan for the Hindi translations, and Tahmid Hossain for the Bengali translations of the above-mentioned English prompts. | [
"### Description\n\nThis dataset is derived from the already existing dataset made by AI4Bharat. We have used the IndicXParaphrase dataset of AI4Bharat to create this instruction style dataset. \nThis was created as part of Aya Open Science Initiative from Cohere For AI.\n\nIndicXParaphrase is multilingual, and n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset(IndicXParaphrase) was made available under the cc-0 license.",
"### Template\n\nThe following templates where used for converting the original dataset:",
"### Acknowledgement\nThank you, Jay Patel for helping by providing the Gujarati translations, Amarjit for helping by providing the Punjabi translations,\nYogesh Haribhau Kulkarni for helping by providing the Marathi translations, \nGanesh Jagadeesan for helping by providing the Hindi translations and Tahmid Hossain for helping by providing the Bengali translations of the above mentioned English prompts."
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Malayalam #language-Gujarati #language-Marathi #language-Hindi #language-Panjabi #language-Bengali #license-cc #region-us \n",
"### Description\n\nThis dataset is derived from the already existing dataset made by AI4Bharat. We have used the IndicXParaphrase dataset of AI4Bharat to create this instruction style dataset. \nThis was created as part of Aya Open Science Initiative from Cohere For AI.\n\nIndicXParaphrase is multilingual, and n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset(IndicXParaphrase) was made available under the cc-0 license.",
"### Template\n\nThe following templates where used for converting the original dataset:",
"### Acknowledgement\nThank you, Jay Patel for helping by providing the Gujarati translations, Amarjit for helping by providing the Punjabi translations,\nYogesh Haribhau Kulkarni for helping by providing the Marathi translations, \nGanesh Jagadeesan for helping by providing the Hindi translations and Tahmid Hossain for helping by providing the Bengali translations of the above mentioned English prompts."
] |
a8e32c56bdb1341842e25471ab2f1d0b12ed542d | ## Description
8,643 Images - 14 Types of Abnormal Images & Videos Data. The data includes indoor scenes (library, craft store, etc.) and outdoor scenes (road, building, square, railway station, etc.). The data diversity includes multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions. The data can be used for tasks such as image deblurring and image denoising.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1043?source=Huggingface
## Data size
8,643 images, 14 types
## Collecting environment
indoor scenes (library, craft store, etc.), outdoor scenes (road, building, square, railway station, etc.)
## Data diversity
including multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions
## Device
cellphone, camera
## Data format
the video data format is .mp4, the image data format is .jpg or .png
## Collecting content
video data includes Dynamic Blur, Blocking Artifact; image data includes Abnormal Saturation, Abnormal Contrast Ratio, Resolution Reduction, Blurry Image, Blurry Scene, Noise Image, Noise-like Image, Old Image, Black Frame, Color-bar Noise, Streak Shaped Noise and Lens Occlusion
## Accuracy
according to the collecting content, the collection accuracy is over 97%
# Licensing Information
Commercial License
| Nexdata/14_Types_of_Abnormal_Images_Videos_Data | [
"license:cc-by-nc-3.0",
"region:us"
] | 2024-02-04T09:24:59+00:00 | {"license": "cc-by-nc-3.0"} | 2024-02-04T10:05:07+00:00 | [] | [] | TAGS
#license-cc-by-nc-3.0 #region-us
| ## Description
8,643 Images - 14 Types of Abnormal Images & Videos Data. The data includes indoor scenes (library, craft store, etc.) and outdoor scenes (road, building, square, railway station, etc.). The data diversity includes multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions. The data can be used for tasks such as image deblurring and image denoising.
For more details, please refer to the link: URL
## Data size
8,643 images, 14 types
## Collecting environment
indoor scenes (library, craft store, etc.), outdoor scenes (road, building, square, railway station, etc.)
## Data diversity
including multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions
## Device
cellphone, camera
## Data format
the video data format is .mp4, the image data format is .jpg or .png
## Collecting content
video data includes Dynamic Blur, Blocking Artifact; image data includes Abnormal Saturation, Abnormal Contrast Ratio, Resolution Reduction, Blurry Image, Blurry Scene, Noise Image, Noise-like Image, Old Image, Black Frame, Color-bar Noise, Streak Shaped Noise and Lens Occlusion
## Accuracy
according to the collecting content, the collection accuracy is over 97%
# Licensing Information
Commercial License
| [
"## Description\n8,643 Images - 14 Types of Abnormal Images & Videos Data. The data includes indoor scenes (library, craft store, etc.) and outdoor scenes (road, building, square, railway station, etc.). The data diversity includes multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions. The data can be used for tasks such as image deblurring and image denoising.\n\nFor more details, please refer to the link: URL",
"## Data size\n8,643 images, 14 types",
"## Collecting environment\nindoor scenes (library, craft store, etc.), outdoor scenes (road, building, square, railway station, etc.)",
"## Data diversity\nincluding multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions",
"## Device\ncellphone, camera",
"## Data format\nthe video data format is .mp4, the image data format is .jpg or .png",
"## Collecting content\nvideo data includes Dynamic Blur, Blocking Artifact; image data includes Abnormal Saturation, Abnormal Contrast Ratio, Resolution Reduction, Blurry Image, Blurry Scene, Noise Image, Noise-like Image, Old Image, Black Frame, Color-bar Noise, Streak Shaped Noise and Lens Occlusion",
"## Accuracy\naccording to the Collection content, the collecting accuracy is over 97%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-3.0 #region-us \n",
"## Description\n8,643 Images - 14 Types of Abnormal Images & Videos Data. The data includes indoor scenes (library, craft store, etc.) and outdoor scenes (road, building, square, railway station, etc.). The data diversity includes multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions. The data can be used for tasks such as image deblurring and image denoising.\n\nFor more details, please refer to the link: URL",
"## Data size\n8,643 images, 14 types",
"## Collecting environment\nindoor scenes (library, craft store, etc.), outdoor scenes (road, building, square, railway station, etc.)",
"## Data diversity\nincluding multiple scenes, 14 types of abnormal videos & images data, different light conditions, different image resolutions",
"## Device\ncellphone, camera",
"## Data format\nthe video data format is .mp4, the image data format is .jpg or .png",
"## Collecting content\nvideo data includes Dynamic Blur, Blocking Artifact; image data includes Abnormal Saturation, Abnormal Contrast Ratio, Resolution Reduction, Blurry Image, Blurry Scene, Noise Image, Noise-like Image, Old Image, Black Frame, Color-bar Noise, Streak Shaped Noise and Lens Occlusion",
"## Accuracy\naccording to the Collection content, the collecting accuracy is over 97%",
"# Licensing Information\nCommercial License"
] |
0373a533ae8d06fc0ac187877fc4a763127c64da |
# Dataset Card for Evaluation run of Kquant03/Nanashi-2x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Nanashi-2x7B-bf16](https://huggingface.co/Kquant03/Nanashi-2x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T09:23:05.905674](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16/blob/main/results_2024-02-04T09-23-05.905674.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6565798086186085,
"acc_stderr": 0.03193575998838622,
"acc_norm": 0.6557702052622627,
"acc_norm_stderr": 0.032610001728159095,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7130522065140252,
"mc2_stderr": 0.01482641985542771
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230723,
"acc_norm": 0.887572196773551,
"acc_norm_stderr": 0.003152464637757646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7130522065140252,
"mc2_stderr": 0.01482641985542771
},
"harness|winogrande|5": {
"acc": 0.8610891870560379,
"acc_stderr": 0.009720200907402105
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.01262542315228303
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16 | [
"region:us"
] | 2024-02-04T09:25:23+00:00 | {"pretty_name": "Evaluation run of Kquant03/Nanashi-2x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Nanashi-2x7B-bf16](https://huggingface.co/Kquant03/Nanashi-2x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:23:05.905674](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16/blob/main/results_2024-02-04T09-23-05.905674.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6565798086186085,\n \"acc_stderr\": 0.03193575998838622,\n \"acc_norm\": 0.6557702052622627,\n \"acc_norm_stderr\": 0.032610001728159095,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7130522065140252,\n \"mc2_stderr\": 0.01482641985542771\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725225,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n \"acc_stderr\": 0.004501137895230723,\n \"acc_norm\": 0.887572196773551,\n \"acc_norm_stderr\": 0.003152464637757646\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7130522065140252,\n \"mc2_stderr\": 0.01482641985542771\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8610891870560379,\n \"acc_stderr\": 0.009720200907402105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \"acc_stderr\": 0.01262542315228303\n 
}\n}\n```", "repo_url": "https://huggingface.co/Kquant03/Nanashi-2x7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-23-05.905674.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-23-05.905674.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-23-05.905674.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-23-05.905674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-23-05.905674.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_23_05.905674", "path": ["**/details_harness|winogrande|5_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-23-05.905674.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T09_23_05.905674", "path": ["results_2024-02-04T09-23-05.905674.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T09-23-05.905674.parquet"]}]}]} | 2024-02-04T09:25:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Nanashi-2x7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Nanashi-2x7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
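A minimal sketch of that call (repo and config names taken verbatim from this card's metadata; `harness_winogrande_5` is just one of the 63 available configurations):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16",
                    "harness_winogrande_5",
                    split="train")
```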
## Latest results
These are the latest results from run 2024-02-04T09:23:05.905674 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
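A similar hedged sketch for pulling the "latest" alias of a single eval (the config name `harness_gsm8k_5` and the `latest` split both appear in this card's file list):

```python
from datasets import load_dataset

# "latest" always resolves to the most recent timestamped run,
# here 2024-02-04T09:23:05.905674.
latest_gsm8k = load_dataset("open-llm-leaderboard/details_Kquant03__Nanashi-2x7B-bf16",
                            "harness_gsm8k_5",
                            split="latest")
```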
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Nanashi-2x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Nanashi-2x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:23:05.905674(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Nanashi-2x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Nanashi-2x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:23:05.905674(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7083679a108ba77b5eafb3886a1b93ff75783a68 |
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b-v3](https://huggingface.co/ibivibiv/multimaster-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3",
"harness_winogrande_5",
split="train")
```
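The aggregated metrics are also exposed as their own configuration. A minimal follow-up sketch (the `results` config and `latest` split names follow the layout used by the other cards in this dump; assumed to apply here as well):

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" points at the most recent
# timestamped split, here 2024-02-04T09:25:06.873514.
agg = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3",
                   "results",
                   split="latest")
```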
## Latest results
These are the [latest results from run 2024-02-04T09:25:06.873514](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3/blob/main/results_2024-02-04T09-25-06.873514.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6560385561881283,
"acc_stderr": 0.031932333372589335,
"acc_norm": 0.6553946325781543,
"acc_norm_stderr": 0.03260163732633882,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5970305973742662,
"mc2_stderr": 0.01550484415432735
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537297,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.013340916085246254
},
"harness|hellaswag|10": {
"acc": 0.7027484564827724,
"acc_stderr": 0.004561141293448457,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.0032831658676313624
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944437,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971114,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568603,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568603
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053755,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053755
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066304,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.01659802212058043,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.01659802212058043
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595284,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5970305973742662,
"mc2_stderr": 0.01550484415432735
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.012425078188395975
}
}
```
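
As a quick way to inspect these numbers, the dictionary above can be parsed to rank the MMLU (`hendrycksTest`) subtasks by accuracy. This is a minimal sketch, assuming the JSON printed above has been saved locally as `results.json` (a hypothetical filename) with the entries at the top level; in the downloadable results file the same entries may sit under a `"results"` key:

```python
import json

# Assumption: the dictionary shown above was saved as results.json; in the
# downloadable file these entries may be nested under a "results" key.
with open("results.json") as f:
    results = json.load(f)

# Collect per-subtask accuracy for the MMLU (hendrycksTest) suite.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}

# Show the five strongest and five weakest subtasks.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked[:5] + ranked[-5:]:
    print(f"{acc:.3f}  {task}")
```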
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3 | [
"region:us"
] | 2024-02-04T09:27:26+00:00 | {"pretty_name": "Evaluation run of ibivibiv/multimaster-7b-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b-v3](https://huggingface.co/ibivibiv/multimaster-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:25:06.873514](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3/blob/main/results_2024-02-04T09-25-06.873514.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6560385561881283,\n \"acc_stderr\": 0.031932333372589335,\n \"acc_norm\": 0.6553946325781543,\n \"acc_norm_stderr\": 0.03260163732633882,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5970305973742662,\n \"mc2_stderr\": 0.01550484415432735\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537297,\n \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246254\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7027484564827724,\n \"acc_stderr\": 0.004561141293448457,\n \"acc_norm\": 0.8765186217884884,\n \"acc_norm_stderr\": 0.0032831658676313624\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944437,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944437\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6692307692307692,\n \"acc_stderr\": 0.023854795680971114,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971114\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568603,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568603\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053755,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053755\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066304,\n 
\"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.01659802212058043,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.01659802212058043\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135107,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595284,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595284\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5970305973742662,\n \"mc2_stderr\": 0.01550484415432735\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \"acc_stderr\": 0.012425078188395975\n }\n}\n```", "repo_url": 
"https://huggingface.co/ibivibiv/multimaster-7b-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-25-06.873514.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-25-06.873514.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-25-06.873514.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-25-06.873514.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-25-06.873514.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_25_06.873514", "path": ["**/details_harness|winogrande|5_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-25-06.873514.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T09_25_06.873514", "path": ["results_2024-02-04T09-25-06.873514.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T09-25-06.873514.parquet"]}]}]} | 2024-02-04T09:27:49+00:00 | [] | [] | TAGS
|
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b-v3
Dataset automatically created during the evaluation run of model ibivibiv/multimaster-7b-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
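A minimal sketch, under the same repo-name assumption as above; the task configuration is illustrative (any of the 63 task configs listed in the file mapping works, `harness_winogrande_5` among them):

```python
from datasets import load_dataset

# Load the per-sample details for one task; "latest" always points
# to the most recent evaluation run (here: 2024-02-04T09:25:06).
data = load_dataset(
    "open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3",  # assumed repo name
    "harness_winogrande_5",  # illustrative config; any task config works
    split="latest",
)
```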
## Latest results
These are the latest results from run 2024-02-04T09:25:06.873514 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
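The aggregated metrics for this run live in the `results` configuration, which is visible in the file mapping above. A short sketch of how one might inspect them, again assuming the repo name from the leaderboard's naming convention:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of a run;
# the "latest" split points at the 2024-02-04T09:25:06 run shown above.
results = load_dataset(
    "open-llm-leaderboard/details_ibivibiv__multimaster-7b-v3",  # assumed repo name
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated per-task metrics for this run
```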
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact