sha (string, 40-40) | text (string, 1-13.4M) | id (string, 2-117) | tags (sequence, 1-7.91k) | created_at (string, 25-25) | metadata (string, 2-875k) | last_modified (string, 25-25) | arxiv (sequence, 0-25) | languages (sequence, 0-7.91k) | tags_str (string, 17-159k) | text_str (string, 1-447k) | text_lists (sequence, 0-352) | processed_texts (sequence, 1-353) |
---|---|---|---|---|---|---|---|---|---|---|---|---|
76bf398380d131df7806055616c5cff9d21c0f7e | # Dataset Card for "mmlu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nbalepur/mmlu | [
"region:us"
] | 2024-01-12T20:50:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "dataset", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer_letter", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 8385373.5945069045, "num_examples": 14032}], "download_size": 3488361, "dataset_size": 8385373.5945069045}} | 2024-01-12T20:50:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mmlu"
More Information needed | [
"# Dataset Card for \"mmlu\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mmlu\"\n\nMore Information needed"
] |
c332a94af654f30e9a4944c2e4027f6d147d1c57 | 467 Project Gutenberg books, mostly of older provenance (author died pre-1914), chapterized by Chapter Captor (https://arxiv.org/abs/2011.04163) and then filtered to require that the number of chapters is correct and that the assigned chapter numbers are sequential, starting at 1.
Each line is one chapter of a book.
Keys are "chapter_number", "text", "title", and "metadata" (which contains "id"). The id is the Gutenberg book number. Title is often not present. | afoland/chapterized_PG | [
"license:apache-2.0",
"arxiv:2011.04163",
"region:us"
] | 2024-01-12T21:29:31+00:00 | {"license": "apache-2.0"} | 2024-01-13T01:21:24+00:00 | [
"2011.04163"
] | [] | TAGS
#license-apache-2.0 #arxiv-2011.04163 #region-us
| 467 Project Gutenberg books, mostly of older provenance (author died pre-1914), chapterized by Chapter Captor (URL) and then filtered to require that the number of chapters is correct and that the assigned chapter numbers are sequential, starting at 1.
Each line is one chapter of a book.
Keys are "chapter_number", "text", "title", and "metadata" (which contains "id"). The id is the Gutenberg book number. Title is often not present. | [] | [
"TAGS\n#license-apache-2.0 #arxiv-2011.04163 #region-us \n"
] |
c3f8f32dbba68eaabccf8ff08bd5783202aaa564 |
# Dataset Card for Medical texts simplification
The dataset, consisting of 30 triples (around 800 sentences) of the original text and its
human- and ChatGPT-simplified versions, was created from a subset of the [Medical Notes Classification dataset](https://www.kaggle.com/competitions/medicalnotes-2019/rules).
The original dataset contains medical notes, which come from exactly one of the following five clinical domains:
Gastroenterology, Neurology, Orthopedics, Radiology, and Urology. There are 1239 texts in total in the original dataset.
## Dataset Details
### Dataset Description
- **Curated by:** Liliya Makhmutova, Giancarlo Salton, Fernando Perez-Tellez, and Robert Ross
- **Funded by:** Science Foundation Ireland under Grant number 18/CRT/6183 and the ADAPT SFI Research Centre for AI-Driven Digital Content Technology under Grant No. 13/RC/2106/P2
- **Language(s) (NLP):** English
- **License:** CC-BY-NC-4.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub](https://github.com/LiliyaMakhmutova/medical_texts_simplification/)
<!--- **Paper [optional]:** {{ paper | default("[More Information Needed]", true)}} -->
<!--- **Demo [optional]:** {{ demo | default("[More Information Needed]", true)}} -->
## Uses
Should not be used for any kind of patient treatment without medical professional supervision.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Columns in texts.csv (a brief loading sketch in Python follows this list):
- **file_number** (int): a filename (numeric) in the original dataset
- **line_number** (int): the number of a sentence in the original dataset
- **original** (str): sentence of the original text (preprocessed)
- **human_simplification** (str): sentence of the text simplified by a human
- **chatgpt_simplification** (str): sentence of the text simplified by the ChatGPT
- **images** (list[str]): list of image file names related to a sentence that helps to understand a medical text
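
A minimal loading sketch for this layout is shown below. The file name `texts.csv` comes from the card; the assumption that the **images** column is stored as a stringified Python list is mine and should be checked against the actual file.

```python
import ast

import pandas as pd

# Load the simplification triples (one sentence per row).
df = pd.read_csv("texts.csv")

# Assumption: images is a stringified list such as "['fig1.png']"; parse it,
# and fall back to an empty list for missing values.
df["images"] = df["images"].apply(
    lambda v: ast.literal_eval(v) if isinstance(v, str) and v.startswith("[") else []
)

print(df[["file_number", "line_number", "original", "human_simplification"]].head())
```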
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Medical texts can be very difficult to understand for patients, which may lead to health problems.
More importantly, patients often don’t have access to their medical records, and where they do,
they often cannot understand the meaning due to the very different mental models and background
knowledge that patients and clinicians have.
This leads to patients’ partial exclusion from the recovery process and sub-optimal outcomes.
Medical texts usually contain a great deal of specialised terminology and many abbreviations,
and they often lack the coordination, subordination, and in-sentence explanations that would make causal relationships easier to understand.
Moreover, medical texts usually consist of short ungrammatical sentences.
This makes their understanding difficult not only for laymen but also for healthcare professionals from other fields.
Given these challenges, a machine learning model for medical text simplifications
may be very beneficial both in terms of democratising information access and improving outcomes.
Although a model under no circumstances should add irrelevant information (making up some facts),
it may incorporate true knowledge that is not mentioned in a report to make a medical text clearer and more understandable for a patient.
So, for example, a model might add "Your mother or father are likely to have similar conditions too"
explaining "genetic" reasons, but should not judge whether the blood sugar level in a patient is normal or not.
### Source Data
Based on the [Medical Notes Classification dataset](https://www.kaggle.com/competitions/medicalnotes-2019/rules).
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The texts from the [Medical Notes Classification dataset](https://www.kaggle.com/competitions/medicalnotes-2019/rules)
were preprocessed in the following way to create the **original** column:
- HTML tags were removed
- Minor typos were fixed
- The texts were divided by sentences, each of which was enumerated
Simplified text was created from the complex text under the principles of congruence (preserving the original information and not adding extra information),
fluency (readability), and simplicity,
by a non-native English speaker with no medical or healthcare professional background.
The dataset hasn't been reviewed by a medical professional; the simplifications were created manually with the help of open resources,
including medical papers, surgical videos explaining the procedures, and medical articles with simplified explanations.
For some sentences, pictorial explanations were added for a better understanding.
Automatically created simplifications were obtained through the use of the OpenAI chat interface where the following prompt was used:
"Please simplify the text so that non-professionals could understand it".
ChatGPT tends to produce summarization rather than simplification on longer texts, so, for long texts (typically more than 20 sentences),
the text was input in parts (with the following prompt after the main one, within the same chat context:
"Could you also simplify one more follow-up text so that non-professional could understand it: \<NEXT PART OF THE COMPLEX TEXT\>").
It was decided not to add any examples or special prompting techniques for clarity reasons.
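
The authors worked in the ChatGPT web interface rather than through the API, so the following is only a rough programmatic approximation of the chunked prompting strategy described above. The model name, the chunk boundaries, and the use of the official `openai` client are all assumptions, not details from the card.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

FIRST = "Please simplify the text so that non-professionals could understand it: "
FOLLOWUP = ("Could you also simplify one more follow-up text so that "
            "non-professional could understand it: ")

def simplify_in_parts(chunks: list[str]) -> list[str]:
    """Feed text chunks one by one while keeping the same chat context."""
    messages, simplified = [], []
    for i, chunk in enumerate(chunks):
        messages.append({"role": "user", "content": (FIRST if i == 0 else FOLLOWUP) + chunk})
        reply = client.chat.completions.create(
            model="gpt-3.5-turbo",  # illustrative; the card does not name a model version
            messages=messages,
        )
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})  # preserve chat context
        simplified.append(answer)
    return simplified
```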
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
- Should not be used for any kind of patient treatment without medical professional supervision.
- The dataset is not for commercial use.
- If a model is trained on the dataset to solve a simplification task, it may produce hallucinations.
The outputs of ChatGPT from the **chatgpt_simplification** column were analysed and it was found that:
1. ChatGPT can disclose abbreviations depending on the context.
2. ChatGPT has a very good rewriting ability (this is related to both general language skills and the ability to understand and simplify medical terms).
and
1. ChatGPT tends to produce abstracts or summarizations rather than simplification on long texts.
This may be explained by the limited length of the context.
2. ChatGPT sometimes makes up some facts, which may be very dangerous in such a sensitive field as medicine. ChatGPT may even contradict its output.
3. ChatGPT somehow lacks commonsense reasoning or medical "knowledge".
4. ChatGPT may omit important facts or oversimplify.
5. ChatGPT is biased towards rewriting a text by any means, even if it is already quite simple. Sometimes the rewriting may change the meaning.
ChatGPT also tends to produce more personal sentences.
6. ChatGPT sometimes uses words such as "a", "about", "some", "called", rather than properly simplifying or explaining a concept.
It also frequently outputs undersimplifications.
Examples and analysis may be found on [GitHub](https://github.com/LiliyaMakhmutova/medical_texts_simplification/).
## Dataset Card Contact
[Liliya Makhmutova](https://www.linkedin.com/in/liliya-makhmutova/) | liliya-makhmutova/medical_texts_simplification | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"license:cc-by-nc-4.0",
"medical",
"simplification",
"region:us"
] | 2024-01-12T22:10:45+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "size_categories": ["n<1K"], "task_categories": ["text-generation"], "pretty_name": "Medical texts simplification", "tags": ["medical", "simplification"], "dataset_info": {"features": [{"name": "file_number", "dtype": "int32"}, {"name": "line_number", "dtype": "int32"}, {"name": "original", "dtype": "string"}, {"name": "human_simplification", "dtype": "string"}, {"name": "chatgpt_simplification", "dtype": "string"}, {"name": "images", "dtype": "image"}]}, "viewer": false} | 2024-01-15T17:59:36+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-English #license-cc-by-nc-4.0 #medical #simplification #region-us
|
# Dataset Card for Medical texts simplification
The dataset, consisting of 30 triples (around 800 sentences) of the original text and its
human- and ChatGPT-simplified versions, was created from a subset of the Medical Notes Classification dataset.
The original dataset contains medical notes, which come from exactly one of the following five clinical domains:
Gastroenterology, Neurology, Orthopedics, Radiology, and Urology. There are 1239 texts in total in the original dataset.
## Dataset Details
### Dataset Description
- Curated by: Liliya Makhmutova, Giancarlo Salton, Fernando Perez-Tellez, and Robert Ross
- Funded by: Science Foundation Ireland under Grant number 18/CRT/6183 and the ADAPT SFI Research Centre for AI-Driven Digital Content Technology under Grant No. 13/RC/2106/P2
- Language(s) (NLP): English
- License: CC-BY-NC-4.0
### Dataset Sources [optional]
- Repository: GitHub
## Uses
Should not be used for any kind of patient treatment without medical professional supervision.
## Dataset Structure
Columns in URL:
- file_number (int): a filename (numeric) in the original dataset
- line_number (int): the number of a sentence in the original dataset
- original (str): sentence of the original text (preprocessed)
- human_simplification (str): sentence of the text simplified by a human
- chatgpt_simplification (str): sentence of the text simplified by the ChatGPT
- images (list[str]): list of image file names related to a sentence that helps to understand a medical text
## Dataset Creation
### Curation Rationale
Medical texts can be very difficult to understand for patients, which may lead to health problems.
More importantly, patients often don’t have access to their medical records, and where they do,
they often cannot understand the meaning due to the very different mental models and background
knowledge that patients and clinicians have.
This leads to patients’ partial exclusion from the recovery process and sub-optimal outcomes.
Medical texts usually contain a great deal of specialised terminology and many abbreviations,
and they often lack the coordination, subordination, and in-sentence explanations that would make causal relationships easier to understand.
Moreover, medical texts usually consist of short ungrammatical sentences.
This makes their understanding difficult not only for laymen but also for healthcare professionals from other fields.
Given these challenges, a machine learning model for medical text simplifications
may be very beneficial both in terms of democratising information access and improving outcomes.
Although a model under no circumstances should add irrelevant information (making up some facts),
it may incorporate true knowledge that is not mentioned in a report to make a medical text clearer and more understandable for a patient.
So, for example, a model might add "Your mother or father are likely to have similar conditions too"
explaining "genetic" reasons, but should not judge whether the blood sugar level in a patient is normal or not.
### Source Data
Based on the Medical Notes Classification dataset.
#### Data Collection and Processing
The texts from the Medical Notes Classification dataset
were preprocessed in the following way to create the original column:
- HTML tags were removed
- Minor typos were fixed
- The texts were divided by sentences, each of which was enumerated
Simplified text was created from the complex text under the principles of congruence (preserving the original information and not adding extra information),
fluency (readability), and simplicity,
by a non-native English speaker with no medical or healthcare professional background.
The dataset hasn't been reviewed by a medical professional; the simplifications were created manually with the help of open resources,
including medical papers, surgical videos explaining the procedures, and medical articles with simplified explanations.
For some sentences, pictorial explanations were added for a better understanding.
Automatically created simplifications were obtained through the use of the OpenAI chat interface where the following prompt was used:
"Please simplify the text so that non-professionals could understand it".
ChatGPT tends to produce summarization rather than simplification on longer texts, so, for long texts (typically more than 20 sentences),
the text was input in parts (with the following prompt after the main one, within the same chat context:
"Could you also simplify one more follow-up text so that non-professional could understand it: \<NEXT PART OF THE COMPLEX TEXT\>").
It was decided not to add any examples or special prompting techniques for clarity reasons.
### Recommendations
- Should not be used for any kind of patient treatment without medical professional supervision.
- The dataset is not for commercial use.
- If a model is trained on the dataset to solve a simplification task, it may produce hallucinations.
The outputs of ChatGPT from the chatgpt_simplification column were analysed and it was found that:
1. ChatGPT can disclose abbreviations depending on the context.
2. ChatGPT has a very good rewriting ability (this is related to both general language skills and the ability to understand and simplify medical terms).
and
1. ChatGPT tends to produce abstracts or summarizations rather than simplification on long texts.
This may be explained by the limited length of the context.
2. ChatGPT sometimes makes up some facts, which may be very dangerous in such a sensitive field as medicine. ChatGPT may even contradict its output.
3. ChatGPT somehow lacks commonsense reasoning or medical "knowledge".
4. ChatGPT may omit important facts or oversimplify.
5. ChatGPT is biased towards rewriting a text by any means, even if it is already quite simple. Sometimes the rewriting may change the meaning.
ChatGPT also tends to produce more personal sentences.
6. ChatGPT sometimes uses words such as "a", "about", "some", "called", rather than properly simplifying or explaining a concept.
It also frequently outputs undersimplifications.
Examples and analysis may be found on GitHub.
## Dataset Card Contact
Liliya Makhmutova | [
"# Dataset Card for Medical texts simplification}}\n\nThe dataset consisting of 30 triples (around 800 sentences) of the original text, \nhuman- and ChatGPT-simplified texts was created from a subset Medical Notes Classification dataset. \nThe original dataset contains medical notes, which come from exactly one of the following five clinical domains: \nGastroenterology, Neurology, Orthopedics, Radiology, and Urology. There are 1239 texts in total in the original dataset.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: Liliya Makhmutova, Giancarlo Salton, Fernando Perez-Tellez, and Robert Ross\n- Funded by: Science Foundation Ireland under Grant number 18/CRT/6183 and the ADAPT SFI Research Centre for AI-Driven Digital Content Technology under Grant No. 13/RC/2106/P2\n- Language(s) (NLP): English\n- License: CC-BY-NC-4.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: GitHub",
"## Uses\n\nShould not be used for any kind of patient treatment without medical professional supervision.",
"## Dataset Structure\n\n\n\nColumns in URL:\n- file_number (int): a filename (numeric) in the original dataset\n- line_number (int): the number of a sentence in the original dataset\n- original (str): sentence of the original text (preprocessed)\n- human_simplification (str): sentence of the text simplified by a human\n- chatgpt_simplification (str): sentence of the text simplified by the ChatGPT\n- images (list[str]): list of image file names related to a sentence that helps to understand a medical text",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nMedical texts can be very difficult to understand for patients, which may lead to health problems. \nMore importantly, patients often don’t have access to their medical records, and where they have, \nthe patients often cannot understand the meaning due to the very different mental models and background \nknowledge that patients and clinicians have. \nThis leads to patients’ partial exclusion from the recovery process and sub-optimal outcomes. \nMedical texts usually contain lots of special terminology, many abbreviations, \nlack of coordination, subordination, and explanations in sentences making it harder to understand causal relationships. \nMoreover, medical texts usually consist of short ungrammatical sentences. \nThis makes their understanding difficult not only for laymen but also for healthcare professionals from other fields. \nGiven these challenges, a machine learning model for medical text simplifications \nmay be very beneficial both in terms of democratising information access and improving outcomes. \nAlthough a model under no circumstances should add irrelevant information (making up some facts), \nit may incorporate true knowledge that is not mentioned in a report to make a medical text clearer and more understandable for a patient. \nSo, for example, a model might add \"Your mother or father are likely to have similar conditions too\" \nexplaining \"genetic\" reasons, but should not judge whether the blood sugar level in a patient is normal or not.",
"### Source Data\n\nBased on the Medical Notes Classification dataset.",
"#### Data Collection and Processing\n\n\n\nThe texts from the Medical Notes Classification dataset \nwere preprocessed in the following way to create the original column:\n\n- HTML tags were removed\n- Minor typos were fixed\n- The texts were divided by sentences, each of which was enumerated\n\nSimplified text was created out of complex text under congruence (preserve the original information and not add extra information), \nfluency (readability), and simplicity principles.\nby a non-native English speaker with no medical or healthcare professional background. \nThe dataset hasn't been reviewed by a medical professional, the simplifications were created manually with the help of open resources \nincluding medical papers, surgerical videos explaining the procedures, and medical articles with simplified explanations. \nFor some sentences, pictorial explanations were added for a better understanding.\n\nAutomatically created simplifications were obtained through the use of the OpenAI chat interface where the following prompt was used: \n\"Please simplify the text so that non-professionals could understand it\". \nChatGPT tends to produce summarization rather than simplification on longer texts, so, for long texts (typically more than 20 sentences), \nthe text was inputted by parts (with the following prompt after the main one within the same chat context: \n\"Could you also simplify one more follow-up text so that non-professional could understand it: \\<NEXT PART OF THE COMPLEX TEXT\\>''). \nIt was decided not to add any examples or special prompting techniques for clarity reasons.",
"### Recommendations\n\n\n\n\n- Should not be used for any kind of patient treatment without medical professional supervision.\n- The dataset is not for commercial use.\n- If a model is trained on the dataset to solve a simplification task, it may produce hallucinations.\n\nThe outputs of ChatGPT from the chatgpt_simplification column were analysed and it was found that:\n\n1. ChatGPT can disclose abbreviations depending on the context.\n2. ChatGPT has a very good rewriting ability (this is related to both general language skills and the ability to understand and simplify medical terms).\n\nand \n\n1. ChatGPT tends to produce abstracts or summarizations rather than simplification on long texts.\n This may be explained by the limited length of the context.\n2. ChatGPT sometimes makes up some facts, which may be very dangerous in such a sensitive field as medicine. ChatGPT may even contradict its output.\n3. ChatGPT somehow lacks commonsense reasoning or medical \"knowledge\".\n4. ChatGPT may omit important facts or oversimplify.\n5. ChatGPT is biased towards rewriting a text by any means, even if it has been already quite simple. Sometimes the rewriting may change the meaning.\n ChatGPT also tends to produce more personal sentences.\n6. ChatGPT sometimes uses words such as \"a\", \"about\", \"some\", \"called\", rather than properly simplify a concept or explain.\n It also frequently outputs undersimplifications.\n\nExamples and analysis may be found on GitHub.",
"## Dataset Card Contact\n\nLiliya Makhmutova"
] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #license-cc-by-nc-4.0 #medical #simplification #region-us \n",
"# Dataset Card for Medical texts simplification}}\n\nThe dataset consisting of 30 triples (around 800 sentences) of the original text, \nhuman- and ChatGPT-simplified texts was created from a subset Medical Notes Classification dataset. \nThe original dataset contains medical notes, which come from exactly one of the following five clinical domains: \nGastroenterology, Neurology, Orthopedics, Radiology, and Urology. There are 1239 texts in total in the original dataset.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: Liliya Makhmutova, Giancarlo Salton, Fernando Perez-Tellez, and Robert Ross\n- Funded by: Science Foundation Ireland under Grant number 18/CRT/6183 and the ADAPT SFI Research Centre for AI-Driven Digital Content Technology under Grant No. 13/RC/2106/P2\n- Language(s) (NLP): English\n- License: CC-BY-NC-4.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: GitHub",
"## Uses\n\nShould not be used for any kind of patient treatment without medical professional supervision.",
"## Dataset Structure\n\n\n\nColumns in URL:\n- file_number (int): a filename (numeric) in the original dataset\n- line_number (int): the number of a sentence in the original dataset\n- original (str): sentence of the original text (preprocessed)\n- human_simplification (str): sentence of the text simplified by a human\n- chatgpt_simplification (str): sentence of the text simplified by the ChatGPT\n- images (list[str]): list of image file names related to a sentence that helps to understand a medical text",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nMedical texts can be very difficult to understand for patients, which may lead to health problems. \nMore importantly, patients often don’t have access to their medical records, and where they have, \nthe patients often cannot understand the meaning due to the very different mental models and background \nknowledge that patients and clinicians have. \nThis leads to patients’ partial exclusion from the recovery process and sub-optimal outcomes. \nMedical texts usually contain lots of special terminology, many abbreviations, \nlack of coordination, subordination, and explanations in sentences making it harder to understand causal relationships. \nMoreover, medical texts usually consist of short ungrammatical sentences. \nThis makes their understanding difficult not only for laymen but also for healthcare professionals from other fields. \nGiven these challenges, a machine learning model for medical text simplifications \nmay be very beneficial both in terms of democratising information access and improving outcomes. \nAlthough a model under no circumstances should add irrelevant information (making up some facts), \nit may incorporate true knowledge that is not mentioned in a report to make a medical text clearer and more understandable for a patient. \nSo, for example, a model might add \"Your mother or father are likely to have similar conditions too\" \nexplaining \"genetic\" reasons, but should not judge whether the blood sugar level in a patient is normal or not.",
"### Source Data\n\nBased on the Medical Notes Classification dataset.",
"#### Data Collection and Processing\n\n\n\nThe texts from the Medical Notes Classification dataset \nwere preprocessed in the following way to create the original column:\n\n- HTML tags were removed\n- Minor typos were fixed\n- The texts were divided by sentences, each of which was enumerated\n\nSimplified text was created out of complex text under congruence (preserve the original information and not add extra information), \nfluency (readability), and simplicity principles.\nby a non-native English speaker with no medical or healthcare professional background. \nThe dataset hasn't been reviewed by a medical professional, the simplifications were created manually with the help of open resources \nincluding medical papers, surgerical videos explaining the procedures, and medical articles with simplified explanations. \nFor some sentences, pictorial explanations were added for a better understanding.\n\nAutomatically created simplifications were obtained through the use of the OpenAI chat interface where the following prompt was used: \n\"Please simplify the text so that non-professionals could understand it\". \nChatGPT tends to produce summarization rather than simplification on longer texts, so, for long texts (typically more than 20 sentences), \nthe text was inputted by parts (with the following prompt after the main one within the same chat context: \n\"Could you also simplify one more follow-up text so that non-professional could understand it: \\<NEXT PART OF THE COMPLEX TEXT\\>''). \nIt was decided not to add any examples or special prompting techniques for clarity reasons.",
"### Recommendations\n\n\n\n\n- Should not be used for any kind of patient treatment without medical professional supervision.\n- The dataset is not for commercial use.\n- If a model is trained on the dataset to solve a simplification task, it may produce hallucinations.\n\nThe outputs of ChatGPT from the chatgpt_simplification column were analysed and it was found that:\n\n1. ChatGPT can disclose abbreviations depending on the context.\n2. ChatGPT has a very good rewriting ability (this is related to both general language skills and the ability to understand and simplify medical terms).\n\nand \n\n1. ChatGPT tends to produce abstracts or summarizations rather than simplification on long texts.\n This may be explained by the limited length of the context.\n2. ChatGPT sometimes makes up some facts, which may be very dangerous in such a sensitive field as medicine. ChatGPT may even contradict its output.\n3. ChatGPT somehow lacks commonsense reasoning or medical \"knowledge\".\n4. ChatGPT may omit important facts or oversimplify.\n5. ChatGPT is biased towards rewriting a text by any means, even if it has been already quite simple. Sometimes the rewriting may change the meaning.\n ChatGPT also tends to produce more personal sentences.\n6. ChatGPT sometimes uses words such as \"a\", \"about\", \"some\", \"called\", rather than properly simplify a concept or explain.\n It also frequently outputs undersimplifications.\n\nExamples and analysis may be found on GitHub.",
"## Dataset Card Contact\n\nLiliya Makhmutova"
] |
c8a35b05f321e67d2e406b3bb56b4d3af6096cb6 | # Dataset Card for english_to_spanish
This dataset was culled from the English-Spanish plain-text section of the United Nations Parallel Corpus.
## Dataset Sources
https://conferences.unite.un.org/UNCORPUS/Home/DownloadOverview
## Uses
This dataset can be used for various tasks in NLP, including but not limited to: Machine Translation, Cross-lingual Transfer Learning, Linguistic Research, etc.
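
For example, the corpus can be pulled straight from the Hub with the `datasets` library. The split and column names are not documented on this card, so inspect the returned object rather than assuming a schema:

```python
from datasets import load_dataset

ds = load_dataset("okezieowen/english_to_spanish")
print(ds)  # prints the available splits, column names, and row counts
```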
## Dataset Card Contact
For any queries or contributions, please contact Okezie OKOYE at [email protected]. | okezieowen/english_to_spanish | [
"language:en",
"language:es",
"machine-translation",
"English",
"Spanish",
"region:us"
] | 2024-01-12T23:07:36+00:00 | {"language": ["en", "es"], "tags": ["machine-translation", "English", "Spanish"]} | 2024-01-12T23:49:02+00:00 | [] | [
"en",
"es"
] | TAGS
#language-English #language-Spanish #machine-translation #English #Spanish #region-us
| # Dataset Card for english_to_spanish
This dataset was culled from the English-Spanish plain-text section of the United Nations Parallel Corpus.
## Dataset Sources
URL
## Uses
This dataset can be used for various tasks in NLP, including but not limited to: Machine Translation, Cross-lingual Transfer Learning, Linguistic Research, etc.
## Dataset Card Contact
For any queries or contributions, please contact Okezie OKOYE at okezieowen@URL. | [
"# Dataset Card for Dataset Name\nThis dataset was culled from the English-Spanish plain-text section of the United Nations Parallel Corpus.",
"## Dataset Sources\nURL",
"## Uses\nThis dataset can be used for various tasks in NLP, including but not limited to: Machine Translation, Cross-lingual Transfer Learning, Linguistic Research, etc.",
"## Dataset Card Contact\nFor any queries or contributions, please contact Okezie OKOYE at okezieowen@URL."
] | [
"TAGS\n#language-English #language-Spanish #machine-translation #English #Spanish #region-us \n",
"# Dataset Card for Dataset Name\nThis dataset was culled from the English-Spanish plain-text section of the United Nations Parallel Corpus.",
"## Dataset Sources\nURL",
"## Uses\nThis dataset can be used for various tasks in NLP, including but not limited to: Machine Translation, Cross-lingual Transfer Learning, Linguistic Research, etc.",
"## Dataset Card Contact\nFor any queries or contributions, please contact Okezie OKOYE at okezieowen@URL."
] |
94bbc559f1dc2efd7ad8a1917d45d91eef783a4a | # RadioModRec-1
<!-- Provide a quick summary of the dataset. -->
RadioModRec-1 is a simulated Automatic Modulation Recognition (AMR) dataset carefully curated for fifteen digital modulation schemes: 4QAM, 16QAM, 64QAM, 256QAM, 8PSK, 16PSK, 32PSK, 64PSK, 128PSK, 256PSK, CPFSK, DBPSK, DQPSK, GFSK, and GMSK, all of which are widely used in modern wireless communication systems. The RadioModRec-1 dataset covers the Rayleigh and Rician channel models under Additive White Gaussian Noise (AWGN) at SNRs from -20 dB to +20 dB in steps of 5 dB.
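
As an illustration of the noise sweep described above (this is not the actual generation code behind RadioModRec-1), AWGN at a target SNR can be added to a complex baseband signal like this:

```python
import numpy as np

def add_awgn(signal: np.ndarray, snr_db: float) -> np.ndarray:
    """Add complex AWGN so the noisy signal has the requested SNR in dB."""
    signal_power = np.mean(np.abs(signal) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = np.sqrt(noise_power / 2) * (
        np.random.randn(*signal.shape) + 1j * np.random.randn(*signal.shape)
    )
    return signal + noise

snr_grid_db = np.arange(-20, 25, 5)  # -20 dB to +20 dB in 5 dB steps, as in the dataset
```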
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [Emmanuel Adetiba and Jamiu R. Olasina]
- **Funded by:** [Part Funding by Google Award for TensorFlow Outreaches in Colleges]
- **Language(s) (AMC):** [Automatic Modulation Recognition]
- **License:** [cc-by-nc-nd-4.0]
## Uses
RadioModRec-1 is a vital resource for state-of-the-art Automatic Modulation Recognition (AMR) research in Software Defined and Cognitive Radio Systems.<!-- Address questions around how the dataset is intended to be used. -->
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
Emmanuel Adetiba and Jamiu R. Olasina, RadioModRec: A Dataset for Automatic Modulation Recognition in Software Defined and Cognitive Radio Research.
## Dataset Card Authors [optional]
Emmanuel Adetiba
## Dataset Card Contact
[email protected]
[email protected] | aspmirlab/RadioModRec-1 | [
"task_categories:feature-extraction",
"language:en",
"license:cc-by-nc-nd-4.0",
"region:us"
] | 2024-01-12T23:07:49+00:00 | {"language": ["en"], "license": "cc-by-nc-nd-4.0", "task_categories": ["feature-extraction"]} | 2024-01-13T02:35:23+00:00 | [] | [
"en"
] | TAGS
#task_categories-feature-extraction #language-English #license-cc-by-nc-nd-4.0 #region-us
| # RadioModRec-1
RadioModRec-1 is a simulated Automatic Modulation Recognition (AMR) dataset carefully curated for fifteen digital modulation schemes: 4QAM, 16QAM, 64QAM, 256QAM, 8PSK, 16PSK, 32PSK, 64PSK, 128PSK, 256PSK, CPFSK, DBPSK, DQPSK, GFSK, and GMSK, all of which are widely used in modern wireless communication systems. The RadioModRec-1 dataset covers the Rayleigh and Rician channel models under Additive White Gaussian Noise (AWGN) at SNRs from -20 dB to +20 dB in steps of 5 dB.
### Dataset Description
- Curated by: [Emmanuel Adetiba and Jamiu R. Olasina]
- Funded by: [Part Funding by Google Award for TensorFlow Outreaches in Colleges]
- Language(s) (AMC): [Automatic Modulation Recognition]
- License: [cc-by-nc-nd-4.0]
## Uses
RadioModRec-1 is a vital resource for state-of-the-art Automatic Modulation Recognition (AMR) research in Software Defined and Cognitive Radio Systems.
## Citation [optional]
Emmanuel Adetiba and Jamiu R. Olasina, RadioModRec: A Dataset for Automatic Modulation Recognition in Software Defined and Cognitive Radio Research.
## Dataset Card Authors [optional]
Emmanuel Adetiba
## Dataset Card Contact
aspmirlab@URL
emmanuel.adetiba@URL | [
"# RadioModRec-1\n\n\n\nRadioModRec-1 is an Automatic Modulation Recognition (AMR) simulated dataset carefully curated for fifteen digital modulation schemes consisting of 4QAM, 16QAM, 64QAM, 256QAM, 8PSK, 16PSK, 32PSK, 64PSK, 128PSK, 256PSK, CPFSK, DBPSK, DQPSK, GFSK, and GMSK whose usefulness is predominantly found in modern wireless communication systems. RadioModRec-1 dataset caters for the Rayleigh and the Rician channel models under the Additive White Gaussian Noise (AWGN) from -20dB to +20dB at a step of +5dB.",
"### Dataset Description\n\n\n\n- Curated by: [Emmanuel Adetiba and Jamiu R. Olasina]\n- Funded by: [Part Funding by Google Award for TensorFlow Outreaches in Colleges]\n- Language(s) (AMC): [Automatic Modulation Recognition]\n- License: [cc-by-nc-nd-4.0]",
"## Uses\n\nRadioModRec-1 is a vital resource for state-of-the-art Automatic Modulation Recognition (AMR) research in Software Defined and Cognitive Radio Systems.\n\n[optional]\n\nEmmanuel Adetiba and Jamiu R. Olasina, RadioModRec: A Dataset for Automatic Modulation Recognition in Software Defined and Cognitive Radio Research.",
"## Dataset Card Authors [optional]\n\nEmmanuel Adetiba",
"## Dataset Card Contact\n\naspmirlab@URL\nemmanuel.adetiba@URL"
] | [
"TAGS\n#task_categories-feature-extraction #language-English #license-cc-by-nc-nd-4.0 #region-us \n",
"# RadioModRec-1\n\n\n\nRadioModRec-1 is an Automatic Modulation Recognition (AMR) simulated dataset carefully curated for fifteen digital modulation schemes consisting of 4QAM, 16QAM, 64QAM, 256QAM, 8PSK, 16PSK, 32PSK, 64PSK, 128PSK, 256PSK, CPFSK, DBPSK, DQPSK, GFSK, and GMSK whose usefulness is predominantly found in modern wireless communication systems. RadioModRec-1 dataset caters for the Rayleigh and the Rician channel models under the Additive White Gaussian Noise (AWGN) from -20dB to +20dB at a step of +5dB.",
"### Dataset Description\n\n\n\n- Curated by: [Emmanuel Adetiba and Jamiu R. Olasina]\n- Funded by: [Part Funding by Google Award for TensorFlow Outreaches in Colleges]\n- Language(s) (AMC): [Automatic Modulation Recognition]\n- License: [cc-by-nc-nd-4.0]",
"## Uses\n\nRadioModRec-1 is a vital resource for state-of-the-art Automatic Modulation Recognition (AMR) research in Software Defined and Cognitive Radio Systems.\n\n[optional]\n\nEmmanuel Adetiba and Jamiu R. Olasina, RadioModRec: A Dataset for Automatic Modulation Recognition in Software Defined and Cognitive Radio Research.",
"## Dataset Card Authors [optional]\n\nEmmanuel Adetiba",
"## Dataset Card Contact\n\naspmirlab@URL\nemmanuel.adetiba@URL"
] |
801611132d134432b0344b04c1545da4fdc93e17 | The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)
Creators
Livingstone, Steven R.
Russo, Frank A.
Description
Citing the RAVDESS
The RAVDESS is released under a Creative Commons Attribution license, so please cite the RAVDESS if it is used in your work in any form. Published academic papers should use the academic paper citation for our PLoS1 paper. Personal works, such as machine learning projects/blog posts, should provide a URL to this Zenodo page, though a reference to our PLoS1 paper would also be appreciated.
Academic paper citation
Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391.
Personal use citation
Include a link to this Zenodo page - https://zenodo.org/record/1188976
Commercial Licenses
Commercial licenses for the RAVDESS can be purchased. For more information, please visit our license fee page, or contact us at [email protected].
Contact Information
If you would like further information about the RAVDESS, to purchase a commercial license, or if you experience any issues downloading files, please contact us at [email protected].
Example Videos
Watch a sample of the RAVDESS speech and song videos.
Emotion Classification Users
If you're interested in using machine learning to classify emotional expressions with the RAVDESS, please see our new RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Construction and Validation
Full details on the construction and perceptual validation of the RAVDESS are described in our PLoS ONE paper - https://doi.org/10.1371/journal.pone.0196391.
The RAVDESS contains 7356 files. Each file was rated 10 times on emotional validity, intensity, and genuineness. Ratings were provided by 247 individuals who were characteristic of untrained adult research participants from North America. A further set of 72 participants provided test-retest data. High levels of emotional validity, interrater reliability, and test-retest intrarater reliability were reported. Validation data is open-access, and can be downloaded along with our paper from PLoS ONE.
Description
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) contains 7356 files (total size: 24.8 GB). The database contains 24 professional actors (12 female, 12 male), vocalizing two lexically-matched statements in a neutral North American accent. Speech includes calm, happy, sad, angry, fearful, surprise, and disgust expressions, and song contains calm, happy, sad, angry, and fearful emotions. Each expression is produced at two levels of emotional intensity (normal, strong), with an additional neutral expression. All conditions are available in three modality formats: Audio-only (16bit, 48kHz .wav), Audio-Video (720p H.264, AAC 48kHz, .mp4), and Video-only (no sound). Note, there are no song files for Actor_18.
Audio-only files
Audio-only files of all actors (01-24) are available as two separate zip files (~200 MB each):
Speech file (Audio_Speech_Actors_01-24.zip, 215 MB) contains 1440 files: 60 trials per actor x 24 actors = 1440.
Song file (Audio_Song_Actors_01-24.zip, 198 MB) contains 1012 files: 44 trials per actor x 23 actors = 1012.
Audio-Visual and Video-only files
Video files are provided as separate zip downloads for each actor (01-24, ~500 MB each), and are split into separate speech and song downloads:
Speech files (Video_Speech_Actor_01.zip to Video_Speech_Actor_24.zip) collectively contain 2880 files: 60 trials per actor x 2 modalities (AV, VO) x 24 actors = 2880.
Song files (Video_Song_Actor_01.zip to Video_Song_Actor_24.zip) collectively contain 2024 files: 44 trials per actor x 2 modalities (AV, VO) x 23 actors = 2024.
File Summary
In total, the RAVDESS collection includes 7356 files (2880+2024+1440+1012 files).
File naming convention
Each of the 7356 RAVDESS files has a unique filename. The filename consists of a 7-part numerical identifier (e.g., 02-01-06-01-02-01-12.mp4). These identifiers define the stimulus characteristics:
Filename identifiers
Modality (01 = full-AV, 02 = video-only, 03 = audio-only).
Vocal channel (01 = speech, 02 = song).
Emotion (01 = neutral, 02 = calm, 03 = happy, 04 = sad, 05 = angry, 06 = fearful, 07 = disgust, 08 = surprised).
Emotional intensity (01 = normal, 02 = strong). NOTE: There is no strong intensity for the 'neutral' emotion.
Statement (01 = "Kids are talking by the door", 02 = "Dogs are sitting by the door").
Repetition (01 = 1st repetition, 02 = 2nd repetition).
Actor (01 to 24. Odd numbered actors are male, even numbered actors are female).
Filename example: 02-01-06-01-02-01-12.mp4
Video-only (02)
Speech (01)
Fearful (06)
Normal intensity (01)
Statement "dogs" (02)
1st Repetition (01)
12th Actor (12)
Female, as the actor ID number is even.
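
The seven-field convention above can be decoded mechanically. The helper below is a small illustration written for this card, not part of the official RAVDESS release:

```python
MODALITY = {"01": "full-AV", "02": "video-only", "03": "audio-only"}
CHANNEL = {"01": "speech", "02": "song"}
EMOTION = {"01": "neutral", "02": "calm", "03": "happy", "04": "sad",
           "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised"}
INTENSITY = {"01": "normal", "02": "strong"}
STATEMENT = {"01": "Kids are talking by the door", "02": "Dogs are sitting by the door"}

def parse_ravdess_filename(filename: str) -> dict:
    """Decode a RAVDESS filename such as '02-01-06-01-02-01-12.mp4'."""
    modality, channel, emotion, intensity, statement, repetition, actor = (
        filename.rsplit(".", 1)[0].split("-")
    )
    return {
        "modality": MODALITY[modality],
        "vocal_channel": CHANNEL[channel],
        "emotion": EMOTION[emotion],
        "intensity": INTENSITY[intensity],
        "statement": STATEMENT[statement],
        "repetition": int(repetition),
        "actor": int(actor),
        "actor_sex": "female" if int(actor) % 2 == 0 else "male",
    }

print(parse_ravdess_filename("02-01-06-01-02-01-12.mp4"))
```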
License information
The RAVDESS is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, CC BY-NC-SA 4.0
Commercial licenses for the RAVDESS can also be purchased. For more information, please visit our license fee page, or contact us at [email protected].
Related Data sets
RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Dataset from https://zenodo.org/records/1188976
| birgermoell/ravdess | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-12T23:23:41+00:00 | {"license": "cc-by-nc-sa-4.0"} | 2024-01-12T23:30:49+00:00 | [] | [] | TAGS
#license-cc-by-nc-sa-4.0 #region-us
| The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)
Creators
Livingstone, Steven R.
Russo, Frank A.
Description
Citing the RAVDESS
The RAVDESS is released under a Creative Commons Attribution license, so please cite the RAVDESS if it is used in your work in any form. Published academic papers should use the academic paper citation for our PLoS1 paper. Personal works, such as machine learning projects/blog posts, should provide a URL to this Zenodo page, though a reference to our PLoS1 paper would also be appreciated.
Academic paper citation
Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. URL
Personal use citation
Include a link to this Zenodo page - URL
Commercial Licenses
Commercial licenses for the RAVDESS can be purchased. For more information, please visit our license fee page, or contact us at ravdess@URL.
Contact Information
If you would like further information about the RAVDESS, to purchase a commercial license, or if you experience any issues downloading files, please contact us at ravdess@URL.
Example Videos
Watch a sample of the RAVDESS speech and song videos.
Emotion Classification Users
If you're interested in using machine learning to classify emotional expressions with the RAVDESS, please see our new RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Construction and Validation
Full details on the construction and perceptual validation of the RAVDESS are described in our PLoS ONE paper - URL
The RAVDESS contains 7356 files. Each file was rated 10 times on emotional validity, intensity, and genuineness. Ratings were provided by 247 individuals who were characteristic of untrained adult research participants from North America. A further set of 72 participants provided test-retest data. High levels of emotional validity, interrater reliability, and test-retest intrarater reliability were reported. Validation data is open-access, and can be downloaded along with our paper from PLoS ONE.
Description
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) contains 7356 files (total size: 24.8 GB). The database contains 24 professional actors (12 female, 12 male), vocalizing two lexically-matched statements in a neutral North American accent. Speech includes calm, happy, sad, angry, fearful, surprise, and disgust expressions, and song contains calm, happy, sad, angry, and fearful emotions. Each expression is produced at two levels of emotional intensity (normal, strong), with an additional neutral expression. All conditions are available in three modality formats: Audio-only (16bit, 48kHz .wav), Audio-Video (720p H.264, AAC 48kHz, .mp4), and Video-only (no sound). Note, there are no song files for Actor_18.
Audio-only files
Audio-only files of all actors (01-24) are available as two separate zip files (~200 MB each):
Speech file (Audio_Speech_Actors_01-URL, 215 MB) contains 1440 files: 60 trials per actor x 24 actors = 1440.
Song file (Audio_Song_Actors_01-URL, 198 MB) contains 1012 files: 44 trials per actor x 23 actors = 1012.
Audio-Visual and Video-only files
Video files are provided as separate zip downloads for each actor (01-24, ~500 MB each), and are split into separate speech and song downloads:
Speech files (Video_Speech_Actor_01.zip to Video_Speech_Actor_24.zip) collectively contain 2880 files: 60 trials per actor x 2 modalities (AV, VO) x 24 actors = 2880.
Song files (Video_Song_Actor_01.zip to Video_Song_Actor_24.zip) collectively contain 2024 files: 44 trials per actor x 2 modalities (AV, VO) x 23 actors = 2024.
File Summary
In total, the RAVDESS collection includes 7356 files (2880+2024+1440+1012 files).
File naming convention
Each of the 7356 RAVDESS files has a unique filename. The filename consists of a 7-part numerical identifier (e.g., 02-01-06-01-02-01-12.mp4). These identifiers define the stimulus characteristics:
Filename identifiers
Modality (01 = full-AV, 02 = video-only, 03 = audio-only).
Vocal channel (01 = speech, 02 = song).
Emotion (01 = neutral, 02 = calm, 03 = happy, 04 = sad, 05 = angry, 06 = fearful, 07 = disgust, 08 = surprised).
Emotional intensity (01 = normal, 02 = strong). NOTE: There is no strong intensity for the 'neutral' emotion.
Statement (01 = "Kids are talking by the door", 02 = "Dogs are sitting by the door").
Repetition (01 = 1st repetition, 02 = 2nd repetition).
Actor (01 to 24. Odd numbered actors are male, even numbered actors are female).
Filename example: 02-01-06-01-02-01-12.mp4
Video-only (02)
Speech (01)
Fearful (06)
Normal intensity (01)
Statement "dogs" (02)
1st Repetition (01)
12th Actor (12)
Female, as the actor ID number is even.
License information
The RAVDESS is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, CC BY-NC-SA 4.0
Commercial licenses for the RAVDESS can also be purchased. For more information, please visit our license fee page, or contact us at ravdess@URL.
Related Data sets
RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Dataset from URL
| [] | [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n"
] |
a80e9d10e0ff297e94ecc57da348dfccd52f13af | # Dataset Card for kz919/flan-50k-synthetic-reward-pretrained-mistral-7b-open-orca
## Dataset Description
- **License**: Apache-2.0
- **Pretty Name**: Ranking is generated by normalized inverse perplexity on each of the responses (Open-Orca/Mistral-7B-OpenOrca)
### Dataset Info
The dataset includes features essential for tasks related to response generation and ranking:
1. **prompt**: (string) - The original text prompt.
2. **completion**: (string) - The corresponding completion for each prompt.
3. **task**: (string) - Categorization or description of the task.
4. **ignos-Mistral-T5-7B-v1**: (string) - Responses from the ignos-Mistral-T5-7B-v1 model.
5. **cognAI-lil-c3po**: (string) - Responses from the cognAI-lil-c3po model.
6. **viethq188-Rabbit-7B-DPO-Chat**: (string) - Responses from the viethq188-Rabbit-7B-DPO-Chat model.
7. **cookinai-DonutLM-v1**: (string) - Responses from the cookinai-DonutLM-v1 model.
8. **v1olet-v1olet-merged-dpo-7B**: (string) - Responses from the v1olet-v1olet-merged-dpo-7B model.
9. **normalized_rewards**: (sequence of float32) - Normalized reward scores based on the inverse perplexity, calculated and ranked by [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca).
10. **router_label**: (int64) - Labels for routing the query to the most appropriate model.
### Ranking Methodology
- **Ranking Model**: [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
- **Criteria**: The ranking is based on normalized inverse perplexity, a measure that assesses the fluency and relevance of the model responses in relation to the prompts.
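
The exact scoring code is not published with the card, but one plausible reading of "normalized inverse perplexity" is sketched below: score each candidate response's perplexity under the ranking model, invert it, and normalize the inverses to sum to 1 per prompt. Everything beyond that reading (plain concatenation of prompt and response before tokenization, masking of prompt tokens) is an assumption.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "Open-Orca/Mistral-7B-OpenOrca"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL)  # move to GPU as appropriate
model.eval()

def response_perplexity(prompt: str, response: str) -> float:
    """Perplexity of the response tokens given the prompt (prompt tokens masked out)."""
    prompt_len = tokenizer(prompt, return_tensors="pt").input_ids.shape[1]
    ids = tokenizer(prompt + response, return_tensors="pt").input_ids
    labels = ids.clone()
    labels[:, :prompt_len] = -100  # ignore prompt positions in the loss
    with torch.no_grad():
        loss = model(input_ids=ids, labels=labels).loss  # mean NLL over response tokens
    return float(torch.exp(loss))

def normalized_inverse_perplexity(prompt: str, responses: list[str]) -> list[float]:
    inverse = [1.0 / response_perplexity(prompt, r) for r in responses]
    total = sum(inverse)
    return [v / total for v in inverse]
```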
### Splits
- **Train Split**:
- **num_bytes**: 105,157,970
- **num_examples**: 50,000
### Size
- **Download Size**: 48,848,643 bytes
- **Dataset Size**: 105,157,970 bytes
## Configurations
- **Config Name**: default
- **Data Files**:
- **Train Split**:
- **Path**: data/train-*
## Task Categories
- Text Classification
- Response Generation and Evaluation
## Language
- English (en)
## Size Category
- Medium (10K < n < 100K)
---
This dataset is particularly useful for developing and testing models in response generation tasks, offering a robust framework for comparing different AI models' performance. The unique ranking system based on Open-Orca/Mistral-7B-OpenOrca's normalized inverse perplexity provides an insightful metric for evaluating the fluency and relevance of responses in a wide range of conversational contexts. | kz919/open-orca-flan-50k-synthetic-reward-pretrained-mistral-7b-open-orca | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-13T00:15:30+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "pretty_name": "ranking is generated by noramlized inverse perplexity on each of the responses", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "completion", "dtype": "string"}, {"name": "task", "dtype": "string"}, {"name": "ignos-Mistral-T5-7B-v1", "dtype": "string"}, {"name": "cognAI-lil-c3po", "dtype": "string"}, {"name": "viethq188-Rabbit-7B-DPO-Chat", "dtype": "string"}, {"name": "cookinai-DonutLM-v1", "dtype": "string"}, {"name": "v1olet-v1olet-merged-dpo-7B", "dtype": "string"}, {"name": "normalized_rewards", "sequence": "float32"}, {"name": "router_label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 105157970, "num_examples": 50000}], "download_size": 48848643, "dataset_size": 105157970}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T15:34:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
| # Dataset Card for kz919/flan-50k-synthetic-reward-pretrained-mistral-7b-open-orca
## Dataset Description
- License: Apache-2.0
- Pretty Name: Ranking is generated by normalized inverse perplexity on each of the responses (Open-Orca/Mistral-7B-OpenOrca)
### Dataset Info
The dataset includes features essential for tasks related to response generation and ranking:
1. prompt: (string) - The original text prompt.
2. completion: (string) - The corresponding completion for each prompt.
3. task: (string) - Categorization or description of the task.
4. ignos-Mistral-T5-7B-v1: (string) - Responses from the ignos-Mistral-T5-7B-v1 model.
5. cognAI-lil-c3po: (string) - Responses from the cognAI-lil-c3po model.
6. viethq188-Rabbit-7B-DPO-Chat: (string) - Responses from the viethq188-Rabbit-7B-DPO-Chat model.
7. cookinai-DonutLM-v1: (string) - Responses from the cookinai-DonutLM-v1 model.
8. v1olet-v1olet-merged-dpo-7B: (string) - Responses from the v1olet-v1olet-merged-dpo-7B model.
9. normalized_rewards: (sequence of float32) - Normalized reward scores based on the inverse perplexity, calculated and ranked by Open-Orca/Mistral-7B-OpenOrca.
10. router_label: (int64) - Labels for routing the query to the most appropriate model.
### Ranking Methodology
- Ranking Model: Open-Orca/Mistral-7B-OpenOrca
- Criteria: The ranking is based on normalized inverse perplexity, a measure that assesses the fluency and relevance of the model responses in relation to the prompts.
### Splits
- Train Split:
- num_bytes: 105,157,970
- num_examples: 50,000
### Size
- Download Size: 48,848,643 bytes
- Dataset Size: 105,157,970 bytes
## Configurations
- Config Name: default
- Data Files:
- Train Split:
- Path: data/train-*
## Task Categories
- Text Classification
- Response Generation and Evaluation
## Language
- English (en)
## Size Category
- Medium (10K < n < 100K)
---
This dataset is particularly useful for developing and testing models in response generation tasks, offering a robust framework for comparing different AI models' performance. The unique ranking system based on Open-Orca/Mistral-7B-OpenOrca's normalized inverse perplexity provides an insightful metric for evaluating the fluency and relevance of responses in a wide range of conversational contexts. | [
"# Dataset Card for kz919/flan-50k-synthetic-reward-pretrained-mistral-7b-open-orca",
"## Dataset Description\n\n- License: Apache-2.0\n- Pretty Name: Ranking is generated by normalized inverse perplexity on each of the responses (Open-Orca/Mistral-7B-OpenOrca)",
"### Dataset Info\n\nThe dataset includes features essential for tasks related to response generation and ranking:\n\n1. prompt: (string) - The original text prompt.\n2. completion: (string) - The corresponding completion for each prompt.\n3. task: (string) - Categorization or description of the task.\n4. ignos-Mistral-T5-7B-v1: (string) - Responses from the ignos-Mistral-T5-7B-v1 model.\n5. cognAI-lil-c3po: (string) - Responses from the cognAI-lil-c3po model.\n6. viethq188-Rabbit-7B-DPO-Chat: (string) - Responses from the viethq188-Rabbit-7B-DPO-Chat model.\n7. cookinai-DonutLM-v1: (string) - Responses from the cookinai-DonutLM-v1 model.\n8. v1olet-v1olet-merged-dpo-7B: (string) - Responses from the v1olet-v1olet-merged-dpo-7B model.\n9. normalized_rewards: (sequence of float32) - Normalized reward scores based on the inverse perplexity, calculated and ranked by Open-Orca/Mistral-7B-OpenOrca.\n10. router_label: (int64) - Labels for routing the query to the most appropriate model.",
"### Ranking Methodology\n\n- Ranking Model: Open-Orca/Mistral-7B-OpenOrca\n- Criteria: The ranking is based on normalized inverse perplexity, a measure that assesses the fluency and relevance of the model responses in relation to the prompts.",
"### Splits\n\n- Train Split: \n - num_bytes: 105,157,970\n - num_examples: 50,000",
"### Size\n\n- Download Size: 48,848,643 bytes\n- Dataset Size: 105,157,970 bytes",
"## Configurations\n\n- Config Name: default\n- Data Files: \n - Train Split:\n - Path: data/train-*",
"## Task Categories\n\n- Text Classification\n- Response Generation and Evaluation",
"## Language\n\n- English (en)",
"## Size Category\n\n- Medium (10K < n < 100K)\n\n---\n\nThis dataset is particularly useful for developing and testing models in response generation tasks, offering a robust framework for comparing different AI models' performance. The unique ranking system based on Open-Orca/Mistral-7B-OpenOrca's normalized inverse perplexity provides an insightful metric for evaluating the fluency and relevance of responses in a wide range of conversational contexts."
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"# Dataset Card for kz919/flan-50k-synthetic-reward-pretrained-mistral-7b-open-orca",
"## Dataset Description\n\n- License: Apache-2.0\n- Pretty Name: Ranking is generated by normalized inverse perplexity on each of the responses (Open-Orca/Mistral-7B-OpenOrca)",
"### Dataset Info\n\nThe dataset includes features essential for tasks related to response generation and ranking:\n\n1. prompt: (string) - The original text prompt.\n2. completion: (string) - The corresponding completion for each prompt.\n3. task: (string) - Categorization or description of the task.\n4. ignos-Mistral-T5-7B-v1: (string) - Responses from the ignos-Mistral-T5-7B-v1 model.\n5. cognAI-lil-c3po: (string) - Responses from the cognAI-lil-c3po model.\n6. viethq188-Rabbit-7B-DPO-Chat: (string) - Responses from the viethq188-Rabbit-7B-DPO-Chat model.\n7. cookinai-DonutLM-v1: (string) - Responses from the cookinai-DonutLM-v1 model.\n8. v1olet-v1olet-merged-dpo-7B: (string) - Responses from the v1olet-v1olet-merged-dpo-7B model.\n9. normalized_rewards: (sequence of float32) - Normalized reward scores based on the inverse perplexity, calculated and ranked by Open-Orca/Mistral-7B-OpenOrca.\n10. router_label: (int64) - Labels for routing the query to the most appropriate model.",
"### Ranking Methodology\n\n- Ranking Model: Open-Orca/Mistral-7B-OpenOrca\n- Criteria: The ranking is based on normalized inverse perplexity, a measure that assesses the fluency and relevance of the model responses in relation to the prompts.",
"### Splits\n\n- Train Split: \n - num_bytes: 105,157,970\n - num_examples: 50,000",
"### Size\n\n- Download Size: 48,848,643 bytes\n- Dataset Size: 105,157,970 bytes",
"## Configurations\n\n- Config Name: default\n- Data Files: \n - Train Split:\n - Path: data/train-*",
"## Task Categories\n\n- Text Classification\n- Response Generation and Evaluation",
"## Language\n\n- English (en)",
"## Size Category\n\n- Medium (10K < n < 100K)\n\n---\n\nThis dataset is particularly useful for developing and testing models in response generation tasks, offering a robust framework for comparing different AI models' performance. The unique ranking system based on Open-Orca/Mistral-7B-OpenOrca's normalized inverse perplexity provides an insightful metric for evaluating the fluency and relevance of responses in a wide range of conversational contexts."
] |
2125157e8f7da9d9bf624d84a8247b068c2cd2d6 |
Contains ~1K words and a reputable list of synonyms for each. | ryanbaker/synonyms_1K | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T00:17:44+00:00 | {"license": "apache-2.0"} | 2024-01-13T01:56:02+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Contains ~1K words and a reputable list of synonyms for each. | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
b1b448db11c25d6b23f2482d5bdf3f979ef9ae1f |
# Dataset of sodom_s_beast_draco/ソドムズビースト/ドラコー/所多玛之兽/德拉科 (Fate/Grand Order)
This is the dataset of sodom_s_beast_draco/ソドムズビースト/ドラコー/所多玛之兽/德拉科 (Fate/Grand Order), containing 179 images and their tags.
The core tags of this character are `blonde_hair, red_eyes, bangs, breasts, hair_intakes, ahoge, facial_mark, braid, hair_bun, long_hair, single_hair_bun, crown, tail, french_braid, dragon_tail, ribbon, hair_ribbon, small_breasts, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 179 | 352.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sodom_s_beast_draco_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 179 | 171.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sodom_s_beast_draco_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 472 | 392.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sodom_s_beast_draco_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 179 | 298.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sodom_s_beast_draco_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 472 | 612.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sodom_s_beast_draco_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sodom_s_beast_draco_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, criss-cross_halter, elbow_gloves, looking_at_viewer, red_bikini, scales, solo, white_gloves, white_robe, navel, bare_shoulders, open_clothes, smile, thighs, open_mouth |
| 1 | 15 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, white_gloves, elbow_gloves, red_dress, scales, halterneck, chalice, smile |
| 2 | 13 |  |  |  |  |  | 1girl, animal_ears, large_breasts, solo, looking_at_viewer, smile, claws, open_mouth, cleavage, navel, animal_ear_fluff, teeth, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | criss-cross_halter | elbow_gloves | looking_at_viewer | red_bikini | scales | solo | white_gloves | white_robe | navel | bare_shoulders | open_clothes | smile | thighs | open_mouth | red_dress | halterneck | chalice | animal_ears | large_breasts | claws | cleavage | animal_ear_fluff | teeth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:---------------|:--------------------|:-------------|:---------|:-------|:---------------|:-------------|:--------|:-----------------|:---------------|:--------|:---------|:-------------|:------------|:-------------|:----------|:--------------|:----------------|:--------|:-----------|:-------------------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | | X | X | | X | X | X | | | X | | X | | | X | X | X | | | | | | |
| 2 | 13 |  |  |  |  |  | X | | | X | | | X | | | X | | | X | X | X | | | | X | X | X | X | X | X |
| CyberHarem/sodom_s_beast_draco_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T01:12:33+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T01:59:34+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sodom\_s\_beast\_draco/ソドムズビースト/ドラコー/所多玛之兽/德拉科 (Fate/Grand Order)
============================================================================
This is the dataset of sodom\_s\_beast\_draco/ソドムズビースト/ドラコー/所多玛之兽/德拉科 (Fate/Grand Order), containing 179 images and their tags.
The core tags of this character are 'blonde\_hair, red\_eyes, bangs, breasts, hair\_intakes, ahoge, facial\_mark, braid, hair\_bun, long\_hair, single\_hair\_bun, crown, tail, french\_braid, dragon\_tail, ribbon, hair\_ribbon, small\_breasts, red\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d70388ca07111ff70f94be7a7d18cf95df961027 |
## Haruhi-Zero Conversation Training Data

We plan to extend ChatHaruhi from few-shot to zero-shot. This dataset records the results of Baize-style mutual chats run between various (Chinese) role-playing APIs.

ids holds the two bot character cards used in a chat; the card information can be found at https://huggingface.co/datasets/silk-road/Haruhi-Zero-RolePlaying-movie-PIPPA

In addition, when a card appears as id0 for the first time, it is also recorded in the prompt field.

During a chat, the id and ids fields correspond to the character cards.

- openai means both chat bots use openai
- GLM means both chat bots use CharacterGLM
- Claude means both chat bots use Claude
- Claude_openai means id0 uses Claude and id1 uses openai
- Baichuan means both chat bots use Character-Baichuan-Turbo
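To make the pairing explicit, here is a small illustrative mapping of each configuration name above to the (id0, id1) APIs (this dict is not part of the dataset, just a restatement of the list above):

```python
# (id0_api, id1_api) for each configuration name listed above
PAIRINGS = {
    "openai":        ("openai", "openai"),
    "GLM":           ("CharacterGLM", "CharacterGLM"),
    "Claude":        ("Claude", "Claude"),
    "Claude_openai": ("Claude", "openai"),
    "Baichuan":      ("Character-Baichuan-Turbo", "Character-Baichuan-Turbo"),
}
```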
Baichuan currently has very strict rate limits; if anyone has a Baichuan endpoint with higher concurrency, please contact us so we can borrow it (email [email protected])

Alternatively, run the code below (start_id = 10590, end_id = 12708) and send the results back to me

https://github.com/LC1332/Zero-Haruhi/blob/main/notebook/GenerateBaizeBaichuan.ipynb

Balancing generation time, cost, and quality, the final training will initially use openai and Claude_openai; 15000/2000 samples have already been collected for these two, and more data is being generated

Main project link

https://github.com/LC1332/Chat-Haruhi-Suzumiya

## APIs and Servers

If you have OpenAI, Claude, or Character-Baichuan API resources and can participate, please contact me: send an email or leave your WeChat on Zhihu at https://www.zhihu.com/people/cheng-li-47

If you have enough training resources to tune models at the 13B and Yi 34B scale (2000-token length), you can also contact me to join the later training stages of the project.

If you can organize comparative human-feedback annotation, please contact us as well.

Overall plan

https://o9z6tor1qu.feishu.cn/docx/LxTWdGnP2oQ0oUx8H0wcmyZCnrb

Later, if time permits, we may further use the results from each API for RLHF or DPO.
| silk-road/Haruhi-Baize-Role-Playing-Conversation | [
"task_categories:text-generation",
"language:zh",
"license:cc-by-4.0",
"region:us"
] | 2024-01-13T01:24:51+00:00 | {"language": ["zh"], "license": "cc-by-4.0", "task_categories": ["text-generation"]} | 2024-01-15T01:29:38+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-generation #language-Chinese #license-cc-by-4.0 #region-us
|
## Haruhi-Zero Conversation Training Data

We plan to extend ChatHaruhi from few-shot to zero-shot. This dataset records the results of Baize-style mutual chats run between various (Chinese) role-playing APIs.

ids holds the two bot character cards used in a chat; the card information can be found at https://URL

In addition, when a card appears as id0 for the first time, it is also recorded in the prompt field.

During a chat, the id and ids fields correspond to the character cards.

- openai means both chat bots use openai
- GLM means both chat bots use CharacterGLM
- Claude means both chat bots use Claude
- Claude_openai means id0 uses Claude and id1 uses openai
- Baichuan means both chat bots use Character-Baichuan-Turbo

Baichuan currently has very strict rate limits; if anyone has a Baichuan endpoint with higher concurrency, please contact us so we can borrow it (email chengli.thu@URL)

Alternatively, run the code below (start_id = 10590, end_id = 12708) and send the results back to me

URL

Balancing generation time, cost, and quality, the final training will initially use openai and Claude_openai; 15000/2000 samples have already been collected for these two, and more data is being generated

Main project link

URL

## APIs and Servers

If you have OpenAI, Claude, or Character-Baichuan API resources and can participate, please contact me: send an email or leave your WeChat on Zhihu at URL

If you have enough training resources to tune models at the 13B and Yi 34B scale (2000-token length), you can also contact me to join the later training stages of the project.

If you can organize comparative human-feedback annotation, please contact us as well.

Overall plan

URL

Later, if time permits, we may further use the results from each API for RLHF or DPO.
| [
"## Haruhi-Zero的Conversation训练数据\n\n我们计划拓展ChatHaruhi,从Few-shot到Zero-shot,这个数据集记录使用各个(中文)角色扮演api进行Baize式相互聊天后得到的数据结果\n\nids代表聊天的时候两张bot的角色卡片, 角色卡片的信息可以在https://URL 中找到\n\n并且对于第一次出现的id0,也会在prompt字段中进行记录。\n\n聊天的时候id和ids的卡片进行对应\n\n- openai 代表两个聊天的bot都使用openai\n- GLM 代表两个聊天的bot都使用CharacterGLM\n- Claude 代表两个聊天的bot都使用Claude\n- Claude_openai 代表id0的使用Claude, id1的使用openai\n- Baichuan 代表两个聊天的bot都使用Character-Baichuan-Turbo\n\n目前百川有很严重的访问限制,如果谁有并发更大的百川的接口,可以联系我们借用一下(邮箱chengli.thu@URL)\n\n或者跑下面的代码,(start_id = 10590 end_id = 12708)反馈给我\n\nURL\n\n目前平衡生成时间、成本和效果来看,最终训练准备先采用openai和Claude_openai, 这两者已经采集了15000/2000的数据,正在进一步生成更多数据\n\n主项目链接\n\nURL",
"## API和服务器\n\n如果你有OpenAI、Claude或者Character-Baichuan的api资源 可以参与进来的话,方便联系我一下 发邮件或者在知乎 URL 留一下您的微信\n\n如果你有足够的训练资源去tuning 13B以及Yi 34B规模的模型(2000长度),也可以联系我加入到项目后面的训练中。\n\n如果你能够组织Human Feedback的比较标注,也可以联系我们\n\n整体计划\n\nURL\n\n后期如果有空的话可以进一步拿各家api的结果做一下RLHF或者DPO。"
] | [
"TAGS\n#task_categories-text-generation #language-Chinese #license-cc-by-4.0 #region-us \n",
"## Haruhi-Zero的Conversation训练数据\n\n我们计划拓展ChatHaruhi,从Few-shot到Zero-shot,这个数据集记录使用各个(中文)角色扮演api进行Baize式相互聊天后得到的数据结果\n\nids代表聊天的时候两张bot的角色卡片, 角色卡片的信息可以在https://URL 中找到\n\n并且对于第一次出现的id0,也会在prompt字段中进行记录。\n\n聊天的时候id和ids的卡片进行对应\n\n- openai 代表两个聊天的bot都使用openai\n- GLM 代表两个聊天的bot都使用CharacterGLM\n- Claude 代表两个聊天的bot都使用Claude\n- Claude_openai 代表id0的使用Claude, id1的使用openai\n- Baichuan 代表两个聊天的bot都使用Character-Baichuan-Turbo\n\n目前百川有很严重的访问限制,如果谁有并发更大的百川的接口,可以联系我们借用一下(邮箱chengli.thu@URL)\n\n或者跑下面的代码,(start_id = 10590 end_id = 12708)反馈给我\n\nURL\n\n目前平衡生成时间、成本和效果来看,最终训练准备先采用openai和Claude_openai, 这两者已经采集了15000/2000的数据,正在进一步生成更多数据\n\n主项目链接\n\nURL",
"## API和服务器\n\n如果你有OpenAI、Claude或者Character-Baichuan的api资源 可以参与进来的话,方便联系我一下 发邮件或者在知乎 URL 留一下您的微信\n\n如果你有足够的训练资源去tuning 13B以及Yi 34B规模的模型(2000长度),也可以联系我加入到项目后面的训练中。\n\n如果你能够组织Human Feedback的比较标注,也可以联系我们\n\n整体计划\n\nURL\n\n后期如果有空的话可以进一步拿各家api的结果做一下RLHF或者DPO。"
] |
63dd84d560b03c2d97b2f2f67bf60581a9f35b20 |
# Dataset of owari/尾張/尾张 (Azur Lane)
This is the dataset of owari/尾張/尾张 (Azur Lane), containing 297 images and their tags.
The core tags of this character are `breasts, long_hair, braid, hair_over_one_eye, horns, large_breasts, yellow_eyes, blonde_hair, mole, twin_braids, dark_skin, bangs, earrings, hair_ornament, very_long_hair, dark-skinned_female, mole_under_mouth, hairclip, huge_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 297 | 568.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/owari_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 297 | 274.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/owari_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 795 | 624.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/owari_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 297 | 478.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/owari_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 795 | 952.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/owari_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/owari_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_horns, blush, cleavage, collarbone, grin, jewelry, looking_at_viewer, solo, thighs, bare_shoulders, indoors, nurse_cap, white_dress, black_choker, demon_horns, sitting, teeth, armband, bed, cross, panties |
| 1 | 11 |  |  |  |  |  | 1girl, grin, looking_at_viewer, solo, black_gloves, jewelry, cleavage, blush, choker, upper_body, fishnets, virtual_youtuber, white_background, white_hair, bare_shoulders, simple_background |
| 2 | 26 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, bare_shoulders, black_skirt, grin, pleated_skirt, jewelry, black_gloves, white_background, fishnet_thighhighs, thighs, blush, simple_background, choker, black_thighhighs, elbow_gloves, wide_sleeves |
| 3 | 8 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, thighs, water, white_one-piece_swimsuit, blush, bracelet, grin, necklace, sitting, white_hair |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, jewelry, nipples, penis, solo_focus, blush, mosaic_censoring, smile, navel, spread_legs, sweat, looking_at_viewer, nude, open_mouth, pussy, sex, vaginal, collarbone, missionary, on_back, pillow, teeth, white_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_horns | blush | cleavage | collarbone | grin | jewelry | looking_at_viewer | solo | thighs | bare_shoulders | indoors | nurse_cap | white_dress | black_choker | demon_horns | sitting | teeth | armband | bed | cross | panties | black_gloves | choker | upper_body | fishnets | virtual_youtuber | white_background | white_hair | simple_background | black_skirt | pleated_skirt | fishnet_thighhighs | black_thighhighs | elbow_gloves | wide_sleeves | water | white_one-piece_swimsuit | bracelet | necklace | 1boy | hetero | nipples | penis | solo_focus | mosaic_censoring | smile | navel | spread_legs | sweat | nude | open_mouth | pussy | sex | vaginal | missionary | on_back | pillow |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:-----------|:-------------|:-------|:----------|:--------------------|:-------|:---------|:-----------------|:----------|:------------|:--------------|:---------------|:--------------|:----------|:--------|:----------|:------|:--------|:----------|:---------------|:---------|:-------------|:-----------|:-------------------|:-------------------|:-------------|:--------------------|:--------------|:----------------|:---------------------|:-------------------|:---------------|:---------------|:--------|:---------------------------|:-----------|:-----------|:-------|:---------|:----------|:--------|:-------------|:-------------------|:--------|:--------|:--------------|:--------|:-------|:-------------|:--------|:------|:----------|:-------------|:----------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | X | | X | X | X | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | X | X | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | X | | X | | X | X | X | | | | | | | X | | | | | | | | | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | | X | | X | X | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/owari_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T01:56:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T03:19:00+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of owari/尾張/尾张 (Azur Lane)
==================================
This is the dataset of owari/尾張/尾张 (Azur Lane), containing 297 images and their tags.
The core tags of this character are 'breasts, long\_hair, braid, hair\_over\_one\_eye, horns, large\_breasts, yellow\_eyes, blonde\_hair, mole, twin\_braids, dark\_skin, bangs, earrings, hair\_ornament, very\_long\_hair, dark-skinned\_female, mole\_under\_mouth, hairclip, huge\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
bf9ac39141a93e02054050ff21400b882c6a45bf | A subset of [hanruijiang/civitai-stable-diffusion-2.5m](https://huggingface.co/datasets/hanruijiang/civitai-stable-diffusion-2.5m) database from Civitai API containing only hash, url, nsfwLevel, nsfw, stats, prompt, negativePrompt fields. All items of the dataset have a prompt/negativePrompt populated. It contains exactly 2129933 items. | AdamCodd/Civitai-2m-prompts | [
"task_categories:text-generation",
"task_categories:image-classification",
"size_categories:1M<n<10M",
"language:en",
"art",
"region:us"
] | 2024-01-13T01:58:51+00:00 | {"language": ["en"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "image-classification"], "tags": ["art"]} | 2024-01-13T02:18:28+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-image-classification #size_categories-1M<n<10M #language-English #art #region-us
| A subset of the hanruijiang/civitai-stable-diffusion-2.5m database from the Civitai API, containing only the hash, url, nsfwLevel, nsfw, stats, prompt, and negativePrompt fields. Every item in the dataset has a prompt/negativePrompt populated. It contains exactly 2129933 items. | [] | [
"TAGS\n#task_categories-text-generation #task_categories-image-classification #size_categories-1M<n<10M #language-English #art #region-us \n"
] |
7a1d10e08e7c9be5b9a1a65ebef15dc2a5efed35 |
Summaries for random Wikipedia articles of varying lengths, in fastchat JSON format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
This was designed to train a 32K context-length model. Check the total conversation lengths before using data items for training to ensure that they fit inside your target context window, and discard any that don't fit.
The summary requests were randomly selected from the following types:
- Standard detailed summary
- Summary as a bulleted list
- Summary in tabular form (markdown table)
- Summary in ELI5 form ('explain it like I'm 5')
In addition, summary inputs could be a single article, or a series of (shorter) articles presented one by one as independent documents in the same prompt. In the latter case, the output will include the summary of each input document, in order, with sub-headings.
The wording for each summarization request was randomized, and the position was also randomly selected (before the article(s) or after).
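As a rough illustration of how such randomized requests could be assembled (the actual wording pools used for generation are not published; everything below is made up for the sketch):

```python
import random

REQUESTS = [
    "Write a detailed summary of the document.",    # standard detailed summary
    "Summarize the document as a bulleted list.",   # bulleted list
    "Summarize the document as a markdown table.",  # tabular form
    "Explain the document like I'm 5.",             # ELI5
]

def build_prompt(articles: list[str]) -> str:
    request = random.choice(REQUESTS)
    body = "\n\n".join(f"Document {i + 1}:\n{a}" for i, a in enumerate(articles))
    # Position of the request is also randomized: before or after the article(s).
    return f"{request}\n\n{body}" if random.random() < 0.5 else f"{body}\n\n{request}"
```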
The Wikipedia articles themselves were converted to text and augmented/modified in various random ways (sub-headings removed, bullets removed, citations/background removed, etc.) | grimulkan/wikipedia-summaries | [
"license:unknown",
"region:us"
] | 2024-01-13T02:33:45+00:00 | {"license": "unknown"} | 2024-01-13T02:42:18+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
Summaries for random Wikipedia articles of varying lengths, in fastchat JSON format, generated by 'gpt-4-1106-preview'. OpenAI terms apply.
This was designed to train a 32K context-length model. Check the total conversation lengths before using data items for training to ensure that they fit inside your target context window, and discard any that don't fit.
The summary requests were randomly selected from the following types:
- Standard detailed summary
- Summary as a bulleted list
- Summary in tabular form (markdown table)
- Summary in ELI5 form ('explain it like I'm 5')
In addition, summary inputs could be a single article, or a series of (shorter) articles presented one by one as independent documents in the same prompt. In the latter case, the output will include the summary of each input document, in order, with sub-headings.
The wording for each summarization request was randomized, and the position was also randomly selected (before the article(s) or after).
The Wikipedia articles themselves were converted to text and augmented/modified in various random ways (sub-headings removed, bullets removed, citations/background removed, etc.) | [] | [
"TAGS\n#license-unknown #region-us \n"
] |
23104dc275bed7132cf98c77f480fedffccde751 |
This was meant to be training data to teach an LLM to do some basic document editing tasks.
# File: wikipedia_word_sub.json
**Input:** 150 Wikipedia articles + A request to substitute one word for another (usually a synonym)
**Output:** The same article, with the word substituted as requested
**Format:** Fastchat
# File: wikipedia_err_correct.json
**Input:** 224 Wikipedia articles with typos and other errors introduced randomly using the python [typo library](https://pypi.org/project/typo/) + A request to fix errors
**Output:** The original article (presumably, without errors)
**Format:** Fastchat
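The corruption used the typo library linked above; the snippet below is not that library's API, just a minimal self-contained stand-in that injects the same kind of character-level errors:

```python
import random

def add_typos(text: str, rate: float = 0.02, seed: int | None = None) -> str:
    """Randomly swap, drop, or duplicate characters at roughly `rate` per character."""
    rng = random.Random(seed)
    out: list[str] = []
    for ch in text:
        r = rng.random()
        if r < rate / 3 and out:
            out[-1], ch = ch, out[-1]  # swap with the previous character
            out.append(ch)
        elif r < 2 * rate / 3:
            continue                   # drop the character
        elif r < rate:
            out.extend([ch, ch])       # duplicate the character
        else:
            out.append(ch)
    return "".join(out)
```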
| grimulkan/document-editing | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T02:43:52+00:00 | {"license": "apache-2.0"} | 2024-01-13T02:51:44+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
This was meant to be training data to teach an LLM to do some basic document editing tasks.
# File: wikipedia_word_sub.json
Input: 150 Wikipedia articles + A request to substitute one word for another (usually a synonym)
Output: The same article, with the word substituted as requested
Format: Fastchat
# File: wikipedia_err_correct.json
Input: 224 Wikipedia articles with typos and other errors introduced randomly using the python typo library + A request to fix errors
Output: The original article (presumably, without errors)
Format: Fastchat
| [
"# File: wikipedia_word_sub.json\n\nInput: 150 Wikipedia articles + A request to substitute one word for another (usually a synonym)\n\nOutput: The same article, with the word substituted as requested\n\nFormat: Fastchat",
"# File: wikipedia_err_correct.json\n\nInput: 224 Wikipedia articles with typos and other errors introduced randomly using the python typo library + A request to fix errors\n\nOutput: The original article (presumably, without errors)\n\nFormat: Fastchat"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# File: wikipedia_word_sub.json\n\nInput: 150 Wikipedia articles + A request to substitute one word for another (usually a synonym)\n\nOutput: The same article, with the word substituted as requested\n\nFormat: Fastchat",
"# File: wikipedia_err_correct.json\n\nInput: 224 Wikipedia articles with typos and other errors introduced randomly using the python typo library + A request to fix errors\n\nOutput: The original article (presumably, without errors)\n\nFormat: Fastchat"
] |
3896c305fab79ddbd04821b6451ec714a62f566b |
# Dataset of uesugi_kenshin/上杉謙信/上杉谦信 (Fate/Grand Order)
This is the dataset of uesugi_kenshin/上杉謙信/上杉谦信 (Fate/Grand Order), containing 47 images and their tags.
The core tags of this character are `long_hair, multicolored_hair, black_hair, breasts, two-tone_hair, white_hair, hair_between_eyes, bangs, very_long_hair, streaked_hair, green_eyes, yellow_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 47 | 72.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uesugi_kenshin_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 47 | 36.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uesugi_kenshin_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 119 | 81.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uesugi_kenshin_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 47 | 60.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uesugi_kenshin_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 119 | 124.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uesugi_kenshin_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/uesugi_kenshin_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, crop_top, midriff, navel, smile, looking_at_viewer, solo, bare_shoulders, black_shirt, off_shoulder, white_jacket, white_shorts, open_mouth, sleeveless_shirt, short_shorts, open_jacket, simple_background, blush, long_sleeves, thighs, large_breasts, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | crop_top | midriff | navel | smile | looking_at_viewer | solo | bare_shoulders | black_shirt | off_shoulder | white_jacket | white_shorts | open_mouth | sleeveless_shirt | short_shorts | open_jacket | simple_background | blush | long_sleeves | thighs | large_breasts | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------|:--------|:--------|:--------------------|:-------|:-----------------|:--------------|:---------------|:---------------|:---------------|:-------------|:-------------------|:---------------|:--------------|:--------------------|:--------|:---------------|:---------|:----------------|:-------------------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/uesugi_kenshin_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T02:46:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T02:56:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of uesugi\_kenshin/上杉謙信/上杉谦信 (Fate/Grand Order)
=======================================================
This is the dataset of uesugi\_kenshin/上杉謙信/上杉谦信 (Fate/Grand Order), containing 47 images and their tags.
The core tags of this character are 'long\_hair, multicolored\_hair, black\_hair, breasts, two-tone\_hair, white\_hair, hair\_between\_eyes, bangs, very\_long\_hair, streaked\_hair, green\_eyes, yellow\_eyes, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4a04d4621b19d1f2395a178d7ee763d4d0157a33 |
Q&A testing theory of mind, in Alpaca format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
Each answer was double-checked by `gpt-4-1106-preview`, and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
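The exact verification prompt is not published; here is a minimal sketch of how such a double-check pass could look with the OpenAI Python client (the YES/NO verdict convention is an assumption):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def looks_correct(question: str, answer: str) -> bool:
    resp = client.chat.completions.create(
        model="gpt-4-1106-preview",
        temperature=0,
        messages=[
            {"role": "system", "content": "You verify theory-of-mind answers. Reply with only YES or NO."},
            {"role": "user", "content": f"Question:\n{question}\n\nProposed answer:\n{answer}\n\nIs the answer correct?"},
        ],
    )
    return resp.choices[0].message.content.strip().upper().startswith("YES")

# Entries where looks_correct(...) returns False would be dropped as suspicious.
```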
**Files:**
- `theory_of_mind.json` Usual, double-checked TOM Q&A (150 entries)
- `theory_of_mind_longer.json` Slightly longer Q&A (50 entries)
- `theory_of_mind_airoboros_fixed.json` Double-checked version of only the theory of mind data entries in the [Airoboros dataset](https://huggingface.co/datasets/jondurbin/airoboros-3.1) (339 entries, GPT4 re-generated/corrected many of them, though that doesn't mean they were incorrect to begin with)
| grimulkan/theory-of-mind | [
"license:unknown",
"region:us"
] | 2024-01-13T02:56:18+00:00 | {"license": "unknown"} | 2024-01-13T22:33:53+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
Q&A testing theory of mind, in Alpaca format, generated by 'gpt-4-1106-preview'. OpenAI terms apply.
Each answer was double-checked by 'gpt-4-1106-preview', and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
Files:
- 'theory_of_mind.json' Usual, double-checked TOM Q&A (150 entries)
- 'theory_of_mind_longer.json' Slightly longer Q&A (50 entries)
- 'theory_of_mind_airoboros_fixed.json' Double-checked version of only the theory of mind data entries in the Airoboros dataset (339 entries, GPT4 re-generated/corrected many of them, though that doesn't mean they were incorrect to begin with)
| [] | [
"TAGS\n#license-unknown #region-us \n"
] |
727c89f9769f7c319089334274cda187aa09b50d |
Q&A testing physical reasoning, in Alpaca format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
Each answer was double-checked by `gpt-4-1106-preview`, and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
The types of questions ranged from line of sight problems (who/what is visible from where in various situations), temperature-related questions, pressure-related questions, gravitational effects, etc.
**Files:**
- `physical_reasoning.json` Double-checked physical reasoning questions based on the natural world (500 entries)
- `physical_reasoning_longer.json` Slightly longer Q&A (149 entries)
- `physical_reasoning_magic.json` Same, but assigns magical properties to the world and tests the resulting reasoning (e.g., imagine a mirror that only shows the reflection of what happened 10 seconds ago...) (250 entries)
| grimulkan/physical-reasoning | [
"license:unknown",
"region:us"
] | 2024-01-13T03:02:28+00:00 | {"license": "unknown"} | 2024-01-13T22:35:13+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
Q&A testing physical reasoning, in Alpaca format, generated by 'gpt-4-1106-preview'. OpenAI terms apply.
Each answer was double-checked by 'gpt-4-1106-preview', and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
The types of questions ranged from line of sight problems (who/what is visible from where in various situations), temperature-related questions, pressure-related questions, gravitational effects, etc.
Files:
- 'physical_reasoning.json' Double-checked physical reasoning questions based on the natural world (500 entries)
- 'physical_reasoning_longer.json' Slightly longer Q&A (149 entries)
- 'physical_reasoning_magic.json' Same, but assigns magical properties to the world and tests the resulting reasoning (e.g., imagine a mirror that only shows the reflection of what happened 10 seconds ago...) (250 entries)
| [] | [
"TAGS\n#license-unknown #region-us \n"
] |
c5bab768699c68adca1e43588405f3fda251211e |
Q&A testing relationships between people, in Alpaca format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
Each answer was double-checked by `gpt-4-1106-preview`, and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
E.g., questions of family relations (X has one son, A; X's sister has one son, B; how are A and B related, and why?) and hierarchical relations (e.g., in the workplace).
150 entries in Alpaca format.
| grimulkan/interpersonal-relational-reasoning | [
"license:unknown",
"region:us"
] | 2024-01-13T03:10:45+00:00 | {"license": "unknown"} | 2024-01-13T22:32:08+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
Q&A testing relationships between people, in Alpaca format, generated by 'gpt-4-1106-preview'. OpenAI terms apply.
Each answer was double-checked by 'gpt-4-1106-preview', and suspicious answers were removed, since even GPT4 struggles with accuracy in this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than base.
E.g., questions of family relations (X has one son, A; X's sister has one son, B; how are A and B related, and why?) and hierarchical relations (e.g., in the workplace).
150 entries in Alpaca format.
| [] | [
"TAGS\n#license-unknown #region-us \n"
] |
38c5e6eb6ff58923340948f2f2edac2ceeda7fe2 |
This is a version of [bluemoon-fandom-1-1-rp-cleaned](https://huggingface.co/datasets/Squish42/bluemoon-fandom-1-1-rp-cleaned) further cleaned up using [Karen_TheEditor 13B](https://huggingface.co/FPHam/Karen_theEditor_13b_HF), in Fastchat format.
I tried to fix as many of the grammatical issues as possible and didn't drop any conversations, but there are still issues since Karen is not perfect (and only 13B).
If I detected any large deviations from the original text in the corrections, I fell back to a standard spell-checker, excluding estimated proper nouns from the spell-checker (which is also not perfect).
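A minimal sketch of that deviation check (the actual similarity measure and threshold are not documented; stdlib difflib and a 0.9 cutoff are assumptions):

```python
import difflib

def accept_correction(original: str, corrected: str, threshold: float = 0.9) -> bool:
    """Accept the editor's output only if it stays close to the original text."""
    ratio = difflib.SequenceMatcher(None, original, corrected).ratio()
    return ratio >= threshold  # otherwise fall back to a plain spell-checker pass
```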
I intended this as a first pass and mainly wanted to test an automated way to clean language data sets, but I never got around to doing a better job. Still, it is probably better than most other versions out there (for now). | grimulkan/bluemoon_Karen_cleaned | [
"license:apache-2.0",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T03:42:38+00:00 | {"license": "apache-2.0", "tags": ["not-for-all-audiences"]} | 2024-01-24T00:01:34+00:00 | [] | [] | TAGS
#license-apache-2.0 #not-for-all-audiences #region-us
|
This is a version of bluemoon-fandom-1-1-rp-cleaned further cleaned up using Karen_TheEditor 13B, in Fastchat format.
I tried to fix as many of the grammatical issues as possible and didn't drop any conversations, but there are still issues since Karen is not perfect (and only 13B).
If I detected any large deviations from the original text in the corrections, I fell back to a standard spell-checker, excluding estimated proper nouns from the spell-checker (which is also not perfect).
I intended this as a first pass and mainly wanted to test an automated way to clean language data sets, but I never got around to doing a better job. Still, it is probably better than most other versions out there (for now). | [] | [
"TAGS\n#license-apache-2.0 #not-for-all-audiences #region-us \n"
] |
d633bbceb19a034b87ed8ce2e609d98023033fb9 |
Passkey retrieval training/evaluation data in Fastchat format. You will have to split into train/evaluation manually.
- Articles were drawn from [Long C4](https://huggingface.co/datasets/vllg/long_c4) in varying lengths
- A secret passkey was inserted somewhere in the article, randomly.
- The name and type of the secret are randomly varied (passphrase, secret key, specific fact, favorite colors, password, etc.) and the passkey itself was randomly generated from various proper nouns ([Faker Library](https://pypi.org/project/Faker/)), words/phrases of varying lengths ([WonderWords Library](https://pypi.org/project/wonderwords/)), etc.
- A note to remember the passkey/fact above was added with 50% probability.
- With 15% probability, there was no passkey/fact included and the response indicates that no such information exists.
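A minimal sketch of how one such sample might be assembled (the wording pools and any probabilities beyond those stated above are illustrative):

```python
import random

def make_sample(article_lines: list[str], passkey: str) -> tuple[str, str]:
    """Return (document, target_answer) for one passkey-retrieval example."""
    lines = list(article_lines)
    if random.random() < 0.15:
        # 15% of samples contain no passkey; the answer says so.
        return "\n".join(lines), "No passkey is mentioned in the document."
    fact = f"The secret passkey is {passkey}."
    if random.random() < 0.5:
        fact += " Remember this passkey."  # reminder added with 50% probability
    lines.insert(random.randrange(len(lines) + 1), fact)  # random position
    return "\n".join(lines), f"The passkey is {passkey}."
```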
There are a number of files in the format: `c4_passkey_XXYY.json`.
`XX` is the approximate length of the input prompt in ChatGPT `tiktoken` tokens. It is very approximate, and may translate to different numbers for Llama. Approx. context lengths of 8K, 10K, 16K and 24K are available (24K roughly corresponds to 30K Llama2 tokens).
If `YY` is blank, it includes not just a query for the passkey/fact, but also some follow-up multi-round questions about the surrounding context, which line it is present in, etc.
If `YY` is `_nocontext`, then it is purely a single Q and A, with no follow up questions or context queries. | grimulkan/passkey-retrieval | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T03:51:10+00:00 | {"license": "apache-2.0"} | 2024-01-13T04:02:38+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Passkey retrieval training/evaluation data in Fastchat format. You will have to split into train/evaluation manually.
- Articles were drawn from Long C4 in varying lengths
- A secret passkey was inserted somewhere in the article, randomly.
- The name and type of the secret are randomly varied (passphrase, secret key, specific fact, favorite colors, password, etc.) and the passkey itself was randomly generated from various proper nouns (Faker Library), words/phrases of varying lengths (WonderWords Library), etc.
- A note to remember the passkey/fact above was added with 50% probability.
- With 15% probability, there was no passkey/fact included and the response indicates that no such information exists.
There are a number of files in the format: 'c4_passkey_XXYY.json'.
'XX' is the approximate length of the input prompt in ChatGPT 'tiktoken' tokens. It is very approximate, and may translate to different numbers for Llama. Approx. context lengths of 8K, 10K, 16K and 24K are available (24K roughly corresponds to 30K Llama2 tokens).
If 'YY' is blank, it includes not just a query for the passkey/fact, but also some follow-up multi-round questions about the surrounding context, which line it is present in, etc.
If 'YY' is '_nocontext', then it is purely a single Q and A, with no follow up questions or context queries. | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
618acbc7678160b43063c1a491a65bd58422c1b5 |
# Dataset of jaguar_warrior/ジャガーマン/豹人 (Fate/Grand Order)
This is the dataset of jaguar_warrior/ジャガーマン/豹人 (Fate/Grand Order), containing 28 images and their tags.
The core tags of this character are `short_hair, animal_ears, orange_hair, brown_eyes, tail, fang, brown_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 33.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 18.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 40.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 29.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 56.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jaguar_warrior_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | looking_at_viewer, solo, 1girl, open_mouth, orange_eyes, smile, blush, skin_fang, tiger_ears, collarbone, holding, hoodie, large_breasts, navel, simple_background, standing_on_one_leg, tiger_tail, upper_body, white_background |
| 1 | 8 |  |  |  |  |  | solo, holding, open_mouth, 1boy, looking_at_viewer, male_focus, tiger_print, 1girl, animal_costume, full_body, hood, staff, blonde_hair, orange_eyes, shoes, smile, striped, weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | solo | 1girl | open_mouth | orange_eyes | smile | blush | skin_fang | tiger_ears | collarbone | holding | hoodie | large_breasts | navel | simple_background | standing_on_one_leg | tiger_tail | upper_body | white_background | 1boy | male_focus | tiger_print | animal_costume | full_body | hood | staff | blonde_hair | shoes | striped | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:-------|:--------|:-------------|:--------------|:--------|:--------|:------------|:-------------|:-------------|:----------|:---------|:----------------|:--------|:--------------------|:----------------------|:-------------|:-------------|:-------------------|:-------|:-------------|:--------------|:-----------------|:------------|:-------|:--------|:--------------|:--------|:----------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/jaguar_warrior_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T03:52:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T04:01:46+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of jaguar\_warrior/ジャガーマン/豹人 (Fate/Grand Order)
=======================================================
This is the dataset of jaguar\_warrior/ジャガーマン/豹人 (Fate/Grand Order), containing 28 images and their tags.
The core tags of this character are 'short\_hair, animal\_ears, orange\_hair, brown\_eyes, tail, fang, brown\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
c8744e2827c7ed1273619ae33750afc99a36694c |
Multi-round questions and answers for randomly selected Wikipedia articles of varying lengths, in fastchat JSON format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
This was designed to train a 32K context-length model. Check the total conversation lengths before using data items for training to ensure that they fit inside your target context window, and discard queries that don't fit.
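A minimal sketch of that length check, assuming the standard fastchat layout of a `conversations` list with `value` fields (the tokenizer, file name, and 32K budget are placeholders for your setup):

```python
import json
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # your target tokenizer
MAX_TOKENS = 32768

with open("wikipedia_document_qa.json") as f:  # hypothetical file name
    data = json.load(f)

def conv_tokens(item) -> int:
    return sum(len(tok(turn["value"]).input_ids) for turn in item["conversations"])

kept = [item for item in data if conv_tokens(item) <= MAX_TOKENS]
```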
- Both the questions and answers were generated by GPT4, based on the document. Only information from the included document in the first prompt was considered (and this was verified using GPT4).
- With 25% probability, questions that do not have an answer in the document were asked, to discourage hallucinations.
- With 15% probability, the raw article/document was provided followed by a question. Otherwise, some background about the task at hand was included.
- Articles were augmented in various random ways (sub-headings removed, bullets removed, citations/background removed, etc.)
Only 60 entries are included but they are long and multi-round (whatever I could fit in a budget of ~$1000 in API calls). | grimulkan/wikipedia-document-question-answer | [
"license:unknown",
"region:us"
] | 2024-01-13T04:04:13+00:00 | {"license": "unknown"} | 2024-01-13T04:10:43+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
Multi-round questions and answers for randomly selected Wikipedia articles of varying lengths, in fastchat JSON format, generated by 'gpt-4-1106-preview'. OpenAI terms apply.
This was designed to train a 32K context-length model. Check the total conversation lengths before using data items for training to ensure that they fit inside your target context window, and discard queries that don't fit.
- Both the questions and answers were generated by GPT4, based on the document. Only information from the included document in the first prompt was considered (and this was verified using GPT4).
- With 25% probability, questions that do not have an answer in the document were asked, to discourage hallucinations.
- With 15% probability, the raw article/document was provided followed by a question. Otherwise, some background about the task at hand was included.
- Articles were augmented in various random ways (sub-headings removed, bullets removed, citations/background removed, etc.)
Only 60 entries are included, but they are long and multi-round (whatever I could fit in a budget of ~$1000 in API calls). | [] | [
"TAGS\n#license-unknown #region-us \n"
] |
c446c0e240f4893f0c318056dbb95bb65d079f6a |
# Dataset of hornet/ホーネット/大黄蜂 (Azur Lane)
This is the dataset of hornet/ホーネット/大黄蜂 (Azur Lane), containing 148 images and their tags.
The core tags of this character are `blonde_hair, long_hair, green_eyes, twintails, breasts, large_breasts, bangs, sidelocks, very_long_hair, hat, cowboy_hat, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 148 | 205.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 148 | 116.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 361 | 244.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 148 | 178.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 361 | 340.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hornet_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
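If you want the same IMG+TXT layout as the packaged `800`/`1200` archives, the loaded source can also be re-exported locally. A minimal sketch — `TextualInversionExporter` and the directory names are assumptions taken from the waifuc documentation, not from this card:
```python
from waifuc.export import TextualInversionExporter
from waifuc.source import LocalSource

# Re-read the extracted raw dataset and write each image alongside a
# sidecar .txt file containing its tags (the IMG+TXT layout above).
source = LocalSource('dataset_dir')  # directory extracted in the snippet above
source.export(TextualInversionExporter('export_dir'))  # arbitrary output path
```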
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, bikini_top_only, black_bikini, black_shorts, cleavage, cowboy_shot, midriff, short_shorts, solo, looking_at_viewer, navel, blush, cutoffs, groin, standing, black_sleeves, collarbone, front-tie_bikini_top, white_background, yellow_belt, simple_background, stomach, grin, black_coat, cape, detached_sleeves, teeth, hair_between_eyes, long_sleeves, thighhighs |
| 1 | 5 |  |  |  |  |  | 1girl, bikini_top_only, black_bikini, black_shorts, cape, front-tie_bikini_top, looking_at_viewer, medium_breasts, midriff, short_shorts, solo, black_sleeves, cleavage, yellow_belt, black_thighhighs, character_name, cutoffs, navel, simple_background, black_footwear, cowboy_shot, grin, hand_on_headwear, thigh_boots, white_background, white_thighhighs |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_jacket, cleavage, collarbone, denim_shorts, looking_at_viewer, midriff, navel, off_shoulder, short_shorts, solo, black_ribbon, black_shorts, black_thighhighs, grin, hair_ribbon, long_sleeves, necklace, open_jacket, belt, black_coat, black_footwear, crop_top, cutoffs, full_body, simple_background, sneakers, standing, thighs, white_background, ahoge, cowboy_shot, dog_tags, open_coat, shirt, sleeveless, tank_top |
| 3 | 10 |  |  |  |  |  | 1girl, cleavage, midriff, solo, black_gloves, hair_ribbon, looking_at_viewer, black_ribbon, fingerless_gloves, bikini_top_only, black_bikini, long_sleeves, open_jacket, pizza_slice, belt, blush, short_shorts, smile, black_thighhighs, cropped_jacket, eating, holding_food, navel, open_mouth, red_jacket, sitting, blue_shorts, denim_shorts, groin, official_alternate_costume, red_scarf, simple_background, standing, white_background |
| 4 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, elbow_gloves, black_dress, hair_ribbon, looking_at_viewer, smile, solo, belt, black_ribbon, black_shorts, black_thighhighs, cleavage, short_shorts, navel, necklace, blush, thigh_boots, ahoge, full_body, choker, collarbone, drinking_glass, holding_cup, one_eye_closed, open_mouth, white_footwear |
| 5 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, one_eye_closed, cleavage, grin, black_choker, blush, earrings, necklace, official_alternate_costume, open_mouth, swimsuit, upper_body |
| 6 | 5 |  |  |  |  |  | 1girl, cheerleader, hair_ribbon, holding, looking_at_viewer, navel, pom_pom_(cheerleading), solo, bare_shoulders, blush, crop_top, midriff, miniskirt, pleated_skirt, stomach, white_footwear, black_socks, collarbone, confetti, grin, kneehighs, orange_skirt, sleeveless_shirt, sneakers, underboob, arm_up, armpits, black_ribbon, breasts_apart, cowboy_shot, full_body, jewelry, jumping, nipples, outdoors, parted_bangs, sky, standing_on_one_leg, suspender_skirt, sweat, teeth, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bikini_top_only | black_bikini | black_shorts | cleavage | cowboy_shot | midriff | short_shorts | solo | looking_at_viewer | navel | blush | cutoffs | groin | standing | black_sleeves | collarbone | front-tie_bikini_top | white_background | yellow_belt | simple_background | stomach | grin | black_coat | cape | detached_sleeves | teeth | hair_between_eyes | long_sleeves | thighhighs | medium_breasts | black_thighhighs | character_name | black_footwear | hand_on_headwear | thigh_boots | white_thighhighs | bare_shoulders | black_jacket | denim_shorts | off_shoulder | black_ribbon | hair_ribbon | necklace | open_jacket | belt | crop_top | full_body | sneakers | thighs | ahoge | dog_tags | open_coat | shirt | sleeveless | tank_top | black_gloves | fingerless_gloves | pizza_slice | smile | cropped_jacket | eating | holding_food | open_mouth | red_jacket | sitting | blue_shorts | official_alternate_costume | red_scarf | elbow_gloves | black_dress | choker | drinking_glass | holding_cup | one_eye_closed | white_footwear | black_choker | earrings | swimsuit | upper_body | cheerleader | holding | pom_pom_(cheerleading) | miniskirt | pleated_skirt | black_socks | confetti | kneehighs | orange_skirt | sleeveless_shirt | underboob | arm_up | armpits | breasts_apart | jewelry | jumping | nipples | outdoors | parted_bangs | sky | standing_on_one_leg | suspender_skirt | sweat | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:---------------|:---------------|:-----------|:--------------|:----------|:---------------|:-------|:--------------------|:--------|:--------|:----------|:--------|:-----------|:----------------|:-------------|:-----------------------|:-------------------|:--------------|:--------------------|:----------|:-------|:-------------|:-------|:-------------------|:--------|:--------------------|:---------------|:-------------|:-----------------|:-------------------|:-----------------|:-----------------|:-------------------|:--------------|:-------------------|:-----------------|:---------------|:---------------|:---------------|:---------------|:--------------|:-----------|:--------------|:-------|:-----------|:------------|:-----------|:---------|:--------|:-----------|:------------|:--------|:-------------|:-----------|:---------------|:--------------------|:--------------|:--------|:-----------------|:---------|:---------------|:-------------|:-------------|:----------|:--------------|:-----------------------------|:------------|:---------------|:--------------|:---------|:-----------------|:--------------|:-----------------|:-----------------|:---------------|:-----------|:-----------|:-------------|:--------------|:----------|:-------------------------|:------------|:----------------|:--------------|:-----------|:------------|:---------------|:-------------------|:------------|:---------|:----------|:----------------|:----------|:----------|:----------|:-----------|:---------------|:------|:----------------------|:------------------|:--------|:--------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | | X | X | X | X | | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | X | X | X | X | X | X | X | | X | | X | | X | | X | | X | | X | X | | | | | X | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | | X | | X | X | X | X | X | X | | X | X | | | | X | | X | | | | | | | | X | | | X | | | | | | | | X | | X | X | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | X | X | | | X | X | X | X | X | | | | | X | | | | | | | | | | | | | | | X | | | | X | | X | | | | X | X | X | | X | | X | | | X | | | | | | X | | | X | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | | | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | X | | X | X | X | X | | | | | X | | | | | X | X | | | | X | | | | | | | | | | | X | | | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hornet_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T04:18:49+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T05:00:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hornet/ホーネット/大黄蜂 (Azur Lane)
=======================================
This is the dataset of hornet/ホーネット/大黄蜂 (Azur Lane), containing 148 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, green\_eyes, twintails, breasts, large\_breasts, bangs, sidelocks, very\_long\_hair, hat, cowboy\_hat, black\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4279ca51ba534c793cc3cca8f525723fe6da020b |
# Dataset of mecha_eli_chan_mk_ii/メカエリチャンII号機/机械伊丽亲Ⅱ号机 (Fate/Grand Order)
This is the dataset of mecha_eli_chan_mk_ii/メカエリチャンII号機/机械伊丽亲Ⅱ号机 (Fate/Grand Order), containing 11 images and their tags.
The core tags of this character are `blue_eyes, horns, pink_hair, long_hair, pointy_ears, tail, dragon_horns, dragon_tail, bangs, breasts, dragon_girl, fang, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 10.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mecha_eli_chan_mk_ii_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mecha_eli_chan_mk_ii_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 18 | 14.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mecha_eli_chan_mk_ii_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 10.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mecha_eli_chan_mk_ii_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 18 | 15.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mecha_eli_chan_mk_ii_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mecha_eli_chan_mk_ii_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, dress, open_mouth, smile, detached_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | dress | open_mouth | smile | detached_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-------------|:--------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X |
| CyberHarem/mecha_eli_chan_mk_ii_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:12:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T05:15:07+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mecha\_eli\_chan\_mk\_ii/メカエリチャンII号機/机械伊丽亲Ⅱ号机 (Fate/Grand Order)
===========================================================================
This is the dataset of mecha\_eli\_chan\_mk\_ii/メカエリチャンII号機/机械伊丽亲Ⅱ号机 (Fate/Grand Order), containing 11 images and their tags.
The core tags of this character are 'blue\_eyes, horns, pink\_hair, long\_hair, pointy\_ears, tail, dragon\_horns, dragon\_tail, bangs, breasts, dragon\_girl, fang, small\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
da85d01c1d81e9ef418697b94e041b3201e7ac2f |
# Dataset of saika_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order)
This is the dataset of saika_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order), containing 31 images and their tags.
The core tags of this character are `long_hair, bangs, hair_ornament, blue_eyes, hair_between_eyes, white_hair, feather_hair_ornament, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 55.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 25.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 69 | 52.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 46.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 69 | 88.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saika_magoichi_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saika_magoichi_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, feathers, cape, upper_body, closed_mouth, simple_background, white_background, gloves, holding_gun |
| 1 | 7 |  |  |  |  |  | 1girl, eating, fingerless_gloves, food_on_face, holding_food, looking_at_viewer, onigiri, solo, black_gloves, feathers, upper_body, :t, blush, simple_background, white_background, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | feathers | cape | upper_body | closed_mouth | simple_background | white_background | gloves | holding_gun | eating | fingerless_gloves | food_on_face | holding_food | onigiri | black_gloves | :t | blush |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:-------|:-------------|:---------------|:--------------------|:-------------------|:---------|:--------------|:---------|:--------------------|:---------------|:---------------|:----------|:---------------|:-----|:--------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | | X | X | X | X | X | X | X | X |
| CyberHarem/saika_magoichi_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:12:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T05:19:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of saika\_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order)
=======================================================
This is the dataset of saika\_magoichi/雑賀孫一/杂贺孙一 (Fate/Grand Order), containing 31 images and their tags.
The core tags of this character are 'long\_hair, bangs, hair\_ornament, blue\_eyes, hair\_between\_eyes, white\_hair, feather\_hair\_ornament, grey\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f3bbf9fa8f196398b56e945fc098fb8bf89f1ae7 |
# Dataset of daikokuten/大黒天/大黑天 (Fate/Grand Order)
This is the dataset of daikokuten/大黒天/大黑天 (Fate/Grand Order), containing 53 images and their tags.
The core tags of this character are `animal_ears, mouse_ears, dark-skinned_female, dark_skin, mouse_girl, red_eyes, bangs, white_hair, short_hair, mouse_tail, tail, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 53 | 44.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daikokuten_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 53 | 33.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daikokuten_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 106 | 62.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daikokuten_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 53 | 41.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daikokuten_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 106 | 77.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daikokuten_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/daikokuten_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | solo, 1girl, apron, smile, looking_at_viewer, maid_headdress, open_mouth, blush_stickers, long_sleeves, simple_background, white_background, pink_hair, ribbon |
| 1 | 9 |  |  |  |  |  | apron, nurse_cap, armband, 1girl, dress, solo, hairclip, orange_eyes, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | solo | 1girl | apron | smile | looking_at_viewer | maid_headdress | open_mouth | blush_stickers | long_sleeves | simple_background | white_background | pink_hair | ribbon | nurse_cap | armband | dress | hairclip | orange_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:--------|:--------------------|:-----------------|:-------------|:-----------------|:---------------|:--------------------|:-------------------|:------------|:---------|:------------|:----------|:--------|:-----------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | | | | | | | | | | X | X | X | X | X |
| CyberHarem/daikokuten_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:12:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T05:22:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of daikokuten/大黒天/大黑天 (Fate/Grand Order)
================================================
This is the dataset of daikokuten/大黒天/大黑天 (Fate/Grand Order), containing 53 images and their tags.
The core tags of this character are 'animal\_ears, mouse\_ears, dark-skinned\_female, dark\_skin, mouse\_girl, red\_eyes, bangs, white\_hair, short\_hair, mouse\_tail, tail, bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
133c7ee0fd7c6ae7c5612a9096cad8f334067780 |
# Dataset of suzuka_gozen_summevaca/鈴鹿御前〔サマバケ〕/铃鹿御前〔暑假〕 (Fate/Grand Order)
This is the dataset of suzuka_gozen_summevaca/鈴鹿御前〔サマバケ〕/铃鹿御前〔暑假〕 (Fate/Grand Order), containing 342 images and their tags.
The core tags of this character are `animal_ears, long_hair, animal_ear_fluff, breasts, fox_ears, yellow_eyes, large_breasts, tail, blonde_hair, fox_tail, bangs, fox_girl, dark_skin, dark-skinned_female`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 342 | 537.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzuka_gozen_summevaca_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 342 | 291.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzuka_gozen_summevaca_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 870 | 648.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzuka_gozen_summevaca_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 342 | 467.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzuka_gozen_summevaca_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 870 | 944.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzuka_gozen_summevaca_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/suzuka_gozen_summevaca_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, cleavage, eyewear_on_head, looking_at_viewer, multicolored_hair, necklace, pink_hair, solo, sunglasses, bracelet, gyaru, leopard_print, navel, pink_bikini, tan, grin, heart |
| 1 | 5 |  |  |  |  |  | 1girl, eyewear_on_head, navel, solo, sunglasses, thighs, bare_shoulders, cleavage, collarbone, looking_at_viewer, purple_bikini, smartphone, wrist_scrunchie, grin, necklace, open_clothes, blue_bikini, o-ring, white_jacket |
| 2 | 9 |  |  |  |  |  | 1girl, holding_sword, katana, looking_at_viewer, smile, solo, short_sleeves, hakama_short_skirt, red_hakama, red_skirt |
| 3 | 5 |  |  |  |  |  | 1girl, holding_sword, katana, looking_at_viewer, solo, tate_eboshi, wide_sleeves, open_mouth, :d, bag, hakama, red_skirt, ribbon-trimmed_sleeves, simple_background, white_background |
| 4 | 61 |  |  |  |  |  | 1girl, gyaru, santa_costume, santa_hat, fur_trim, smile, tan, choker, detached_sleeves, looking_at_viewer, bare_shoulders, neck_bell, cleavage, solo, wide_sleeves, energy_wings, collarbone, blush, open_mouth, navel, thighs, skirt, one_eye_closed |
| 5 | 9 |  |  |  |  |  | 1boy, 1girl, blush, breasts_squeezed_together, hetero, paizuri, smile, solo_focus, cum_on_breasts, orange_hair, short_sleeves, collarbone, nipples, open_shirt, heart, looking_at_viewer, motion_lines, penis, pov, censored, ejaculation, open_mouth, skirt, sweat |
| 6 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, sex, vaginal, completely_nude, mosaic_censoring, multicolored_hair, open_mouth, penis, pink_hair, solo_focus, sweat, looking_at_viewer, smile, cowgirl_position, cum_in_pussy, drill_hair, girl_on_top, necklace, spread_legs |
| 7 | 6 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, smile, solo, eyewear_on_head, sunglasses, thighhighs, blush, pink_gloves, race_queen, clothing_cutout, highleg, pink_leotard, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | eyewear_on_head | looking_at_viewer | multicolored_hair | necklace | pink_hair | solo | sunglasses | bracelet | gyaru | leopard_print | navel | pink_bikini | tan | grin | heart | thighs | bare_shoulders | collarbone | purple_bikini | smartphone | wrist_scrunchie | open_clothes | blue_bikini | o-ring | white_jacket | holding_sword | katana | smile | short_sleeves | hakama_short_skirt | red_hakama | red_skirt | tate_eboshi | wide_sleeves | open_mouth | :d | bag | hakama | ribbon-trimmed_sleeves | simple_background | white_background | santa_costume | santa_hat | fur_trim | choker | detached_sleeves | neck_bell | energy_wings | blush | skirt | one_eye_closed | 1boy | breasts_squeezed_together | hetero | paizuri | solo_focus | cum_on_breasts | orange_hair | nipples | open_shirt | motion_lines | penis | pov | censored | ejaculation | sweat | sex | vaginal | completely_nude | mosaic_censoring | cowgirl_position | cum_in_pussy | drill_hair | girl_on_top | spread_legs | elbow_gloves | thighhighs | pink_gloves | race_queen | clothing_cutout | highleg | pink_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:------------------|:--------------------|:--------------------|:-----------|:------------|:-------|:-------------|:-----------|:--------|:----------------|:--------|:--------------|:------|:-------|:--------|:---------|:-----------------|:-------------|:----------------|:-------------|:------------------|:---------------|:--------------|:---------|:---------------|:----------------|:---------|:--------|:----------------|:---------------------|:-------------|:------------|:--------------|:---------------|:-------------|:-----|:------|:---------|:-------------------------|:--------------------|:-------------------|:----------------|:------------|:-----------|:---------|:-------------------|:------------|:---------------|:--------|:--------|:-----------------|:-------|:----------------------------|:---------|:----------|:-------------|:-----------------|:--------------|:----------|:-------------|:---------------|:--------|:------|:-----------|:--------------|:--------|:------|:----------|:------------------|:-------------------|:-------------------|:---------------|:-------------|:--------------|:--------------|:---------------|:-------------|:--------------|:-------------|:------------------|:----------|:---------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 61 |  |  |  |  |  | X | X | | X | | | | X | | | X | | X | | X | | | X | X | X | | | | | | | | | | X | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | X | | | X | | | | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | X | | | X | | X | | X | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | X | | | | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/suzuka_gozen_summevaca_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:13:36+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T06:27:07+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of suzuka\_gozen\_summevaca/鈴鹿御前〔サマバケ〕/铃鹿御前〔暑假〕 (Fate/Grand Order)
==========================================================================
This is the dataset of suzuka\_gozen\_summevaca/鈴鹿御前〔サマバケ〕/铃鹿御前〔暑假〕 (Fate/Grand Order), containing 342 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, animal\_ear\_fluff, breasts, fox\_ears, yellow\_eyes, large\_breasts, tail, blonde\_hair, fox\_tail, bangs, fox\_girl, dark\_skin, dark-skinned\_female', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b6fcda961256b66e532f5087c071b354ee41a2d8 |
# Dataset of jiuwenlong_elisa/九紋竜エリザ/九纹龙伊丽莎 (Fate/Grand Order)
This is the dataset of jiuwenlong_elisa/九紋竜エリザ/九纹龙伊丽莎 (Fate/Grand Order), containing 21 images and their tags.
The core tags of this character are `horns, pink_hair, pointy_ears, dragon_horns, dragon_girl, blue_eyes, long_hair, tail, fangs, bow, dragon_tail, bangs, pink_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 26.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jiuwenlong_elisa_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jiuwenlong_elisa_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 27.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jiuwenlong_elisa_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 23.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jiuwenlong_elisa_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 40.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jiuwenlong_elisa_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jiuwenlong_elisa_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, detached_sleeves, smile, solo, open_mouth, thighhighs, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | smile | solo | open_mouth | thighhighs | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:-------|:-------------|:-------------|:--------------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X |
| CyberHarem/jiuwenlong_elisa_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:15:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T05:19:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of jiuwenlong\_elisa/九紋竜エリザ/九纹龙伊丽莎 (Fate/Grand Order)
=============================================================
This is the dataset of jiuwenlong\_elisa/九紋竜エリザ/九纹龙伊丽莎 (Fate/Grand Order), containing 21 images and their tags.
The core tags of this character are 'horns, pink\_hair, pointy\_ears, dragon\_horns, dragon\_girl, blue\_eyes, long\_hair, tail, fangs, bow, dragon\_tail, bangs, pink\_bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
440b308dd1068610f4da28b44fdbc3022eea654d |
# Dataset of gascogne/ガスコーニュ(μ兵装)/加斯科涅(μ兵装) (Azur Lane)
This is the dataset of gascogne/ガスコーニュ(μ兵装)/加斯科涅(μ兵装) (Azur Lane), containing 213 images and their tags.
The core tags of this character are `blue_hair, short_hair, yellow_eyes, headgear, breasts, bangs, medium_breasts, multicolored_hair, streaked_hair, mechanical_halo, halo`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 213 | 320.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gascogne_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 213 | 174.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gascogne_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 532 | 380.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gascogne_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 213 | 281.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gascogne_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 532 | 543.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gascogne_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gascogne_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, gauntlets, solo, strapless_dress, white_dress, armored_boots, bare_shoulders, looking_at_viewer, black_thighhighs, holding_weapon, simple_background, cross_necklace |
| 1 | 23 |  |  |  |  |  | 1girl, bare_shoulders, solo, cross, gauntlets, strapless_dress, white_dress, looking_at_viewer, choker, simple_background, white_background |
| 2 | 20 |  |  |  |  |  | 1girl, white_bikini, cross_necklace, hair_flower, solo, cleavage, looking_at_viewer, choker, wrist_scrunchie, black_bikini, navel, white_background, simple_background, white_flower |
| 3 | 24 |  |  |  |  |  | bare_shoulders, sleeveless_shirt, white_shirt, 1girl, looking_at_viewer, black_skirt, solo, pleated_skirt, holding_microphone, black_thighhighs, miniskirt, cross_necklace, +_+, collared_shirt, navel, standing, black_belt, fingerless_gloves, cowboy_shot, idol, zettai_ryouiki, detached_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gauntlets | solo | strapless_dress | white_dress | armored_boots | bare_shoulders | looking_at_viewer | black_thighhighs | holding_weapon | simple_background | cross_necklace | cross | choker | white_background | white_bikini | hair_flower | cleavage | wrist_scrunchie | black_bikini | navel | white_flower | sleeveless_shirt | white_shirt | black_skirt | pleated_skirt | holding_microphone | miniskirt | +_+ | collared_shirt | standing | black_belt | fingerless_gloves | cowboy_shot | idol | zettai_ryouiki | detached_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:------------------|:--------------|:----------------|:-----------------|:--------------------|:-------------------|:-----------------|:--------------------|:-----------------|:--------|:---------|:-------------------|:---------------|:--------------|:-----------|:------------------|:---------------|:--------|:---------------|:-------------------|:--------------|:--------------|:----------------|:---------------------|:------------|:------|:-----------------|:-----------|:-------------|:--------------------|:--------------|:-------|:-----------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | X | | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 3 | 24 |  |  |  |  |  | X | | X | | | | X | X | X | | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gascogne_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T05:17:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T06:07:45+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of gascogne/ガスコーニュ(μ兵装)/加斯科涅(μ兵装) (Azur Lane)
=====================================================
This is the dataset of gascogne/ガスコーニュ(μ兵装)/加斯科涅(μ兵装) (Azur Lane), containing 213 images and their tags.
The core tags of this character are 'blue\_hair, short\_hair, yellow\_eyes, headgear, breasts, bangs, medium\_breasts, multicolored\_hair, streaked\_hair, mechanical\_halo, halo', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7021c16d8902deef3c0e5ff7c0a9d1dc395cd23e | # Dataset Card for "cpm_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jay401521/cpm_train | [
"region:us"
] | 2024-01-13T06:17:37+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "options", "struct": [{"name": "<option_0>", "dtype": "string"}, {"name": "<option_1>", "dtype": "string"}]}, {"name": "<ans>", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23437850, "num_examples": 99988}], "download_size": 13297997, "dataset_size": 23437850}} | 2024-01-13T06:17:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cpm_train"
More Information needed | [
"# Dataset Card for \"cpm_train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cpm_train\"\n\nMore Information needed"
] |
e46d163a5864d8dc4fea020c45f2feada5e0684a |
# Dataset of mary_anning/メアリー・アニング/玛丽·安宁 (Fate/Grand Order)
This is the dataset of mary_anning/メアリー・アニング/玛丽·安宁 (Fate/Grand Order), containing 25 images and their tags.
The core tags of this character are `brown_hair, yellow_eyes, long_hair, hat, horns, braid, bow, blue_bow, hair_bow, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 26.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_anning_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 16.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_anning_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 54 | 32.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_anning_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 24.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_anning_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 54 | 44.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_anning_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
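If you only need one packaged variant rather than the raw archive, each zip in the table above can be fetched directly with `hf_hub_download` (the same helper used in the loader below). A minimal sketch for the 800-pixel IMG+TXT package; the filename follows the download links in the table.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# fetch and unpack only the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/mary_anning_fgo',
    repo_type='dataset',
    filename='dataset-800.zip',  # filename as listed in the table above
)
out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```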
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/mary_anning_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, yellow_scarf, long_sleeves, looking_at_viewer, skirt, closed_mouth, shirt, smile, jacket, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | yellow_scarf | long_sleeves | looking_at_viewer | skirt | closed_mouth | shirt | smile | jacket | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:---------------|:--------------------|:--------|:---------------|:--------|:--------|:---------|:--------------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mary_anning_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T06:23:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T06:29:34+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mary\_anning/メアリー・アニング/玛丽·安宁 (Fate/Grand Order)
==========================================================
This is the dataset of mary\_anning/メアリー・アニング/玛丽·安宁 (Fate/Grand Order), containing 25 images and their tags.
The core tags of this character are 'brown\_hair, yellow\_eyes, long\_hair, hat, horns, braid, bow, blue\_bow, hair\_bow, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
942e21f8c35af45a2472395758793e01abccf5ad |
# Dataset of elisabeth_bathory_cinderella/エリザベート・バートリー〔シンデレラ〕/伊丽莎白·巴托里〔灰姑娘〕 (Fate/Grand Order)
This is the dataset of elisabeth_bathory_cinderella/エリザベート・バートリー〔シンデレラ〕/伊丽莎白·巴托里〔灰姑娘〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, blue_eyes, pointy_ears, horns, tail, dragon_horns, bangs, dragon_tail, curled_horns, dragon_girl, ribbon, two_side_up, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 768.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_cinderella_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 429.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_cinderella_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1275 | 942.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_cinderella_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 676.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_cinderella_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1275 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_cinderella_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/elisabeth_bathory_cinderella_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
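Beyond printing items one by one, a quick frequency count over the tag metadata helps spot outfit clusters like the ones listed below. A minimal sketch, assuming only the `item.meta['tags']` field used above; the `list(...)` call makes the count work whether tags arrive as a list or as a tag-to-score mapping.

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted dataset
dataset_dir = 'dataset_dir'
tag_counts = Counter()
for item in LocalSource(dataset_dir):
    # list(...) yields tag names for both list- and dict-shaped tag metadata
    tag_counts.update(list(item.meta['tags']))
print(tag_counts.most_common(20))
```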
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, open_mouth, smile, solo, ;d, blush, holding_microphone, one_eye_closed, plaid_skirt, fang, microphone_stand, corset, heart, bare_shoulders, small_breasts, tail_bow |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, corset, detached_sleeves, looking_at_viewer, open_mouth, plaid_skirt, solo, :d, fang, hair_ribbon, simple_background, white_background, long_sleeves, small_breasts |
| 2 | 5 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, smile, solo, black_dress, blush, closed_mouth |
| 3 | 14 |  |  |  |  |  | 1girl, detached_sleeves, dress_flower, hat_flower, looking_at_viewer, solo, striped_headwear, top_hat, vertical-striped_dress, frilled_dress, holding_microphone, hair_between_eyes, pink_dress, pink_headwear, pink_rose, blush, microphone_stand, pig, sleeveless_dress, layered_dress, long_sleeves, open_mouth, fang, simple_background, squirrel, white_background, :d, earrings, polka_dot_dress, wrist_cuffs, animal, closed_mouth, stuffed_toy, v-shaped_eyebrows, very_long_hair |
| 4 | 14 |  |  |  |  |  | 1girl, solo, witch_hat, looking_at_viewer, jack-o'-lantern, detached_sleeves, vertical-striped_dress, open_mouth, choker, earrings, pumpkin, bat_wings, black_thighhighs, fang, :d, demon_tail, halloween_costume, star_print, blush, holding, horns_through_headwear, polearm |
| 5 | 6 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, solo, white_bikini, navel, frilled_bikini, hair_ribbon, simple_background, small_breasts, white_background, ;d, fang, official_alternate_costume, one_eye_closed, open_mouth, smile |
| 6 | 5 |  |  |  |  |  | 1girl, bikini_armor, black_thighhighs, blush, hair_ribbon, navel, oversized_clothes, red_bikini, small_breasts, solo, tiara, vambraces, white_background, hair_between_eyes, looking_at_viewer, pauldrons, red_armor, silver_trim, simple_background, white_cape, closed_mouth, purple_ribbon, armored_boots, bare_shoulders, blue_ribbon, gauntlets, gloves, groin, holding, red_choker, very_long_hair |
| 7 | 8 |  |  |  |  |  | 1girl, bikini_armor, black_thighhighs, looking_at_viewer, pauldrons, silver_trim, small_breasts, smile, solo, holding_sword, red_armor, red_bikini, tiara, navel, simple_background, white_background, blush, choker, armored_boots, fang, gauntlets, white_cape, closed_mouth, full_body, knee_boots, red_footwear, vambraces |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | looking_at_viewer | open_mouth | smile | solo | ;d | blush | holding_microphone | one_eye_closed | plaid_skirt | fang | microphone_stand | corset | heart | bare_shoulders | small_breasts | tail_bow | :d | hair_ribbon | simple_background | white_background | long_sleeves | black_dress | closed_mouth | dress_flower | hat_flower | striped_headwear | top_hat | vertical-striped_dress | frilled_dress | hair_between_eyes | pink_dress | pink_headwear | pink_rose | pig | sleeveless_dress | layered_dress | squirrel | earrings | polka_dot_dress | wrist_cuffs | animal | stuffed_toy | v-shaped_eyebrows | very_long_hair | witch_hat | jack-o'-lantern | choker | pumpkin | bat_wings | black_thighhighs | demon_tail | halloween_costume | star_print | holding | horns_through_headwear | polearm | collarbone | white_bikini | navel | frilled_bikini | official_alternate_costume | bikini_armor | oversized_clothes | red_bikini | tiara | vambraces | pauldrons | red_armor | silver_trim | white_cape | purple_ribbon | armored_boots | blue_ribbon | gauntlets | gloves | groin | red_choker | holding_sword | full_body | knee_boots | red_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------------------|:-------------|:--------|:-------|:-----|:--------|:---------------------|:-----------------|:--------------|:-------|:-------------------|:---------|:--------|:-----------------|:----------------|:-----------|:-----|:--------------|:--------------------|:-------------------|:---------------|:--------------|:---------------|:---------------|:-------------|:-------------------|:----------|:-------------------------|:----------------|:--------------------|:-------------|:----------------|:------------|:------|:-------------------|:----------------|:-----------|:-----------|:------------------|:--------------|:---------|:--------------|:--------------------|:-----------------|:------------|:------------------|:---------|:----------|:------------|:-------------------|:-------------|:--------------------|:-------------|:----------|:-------------------------|:----------|:-------------|:---------------|:--------|:-----------------|:-----------------------------|:---------------|:--------------------|:-------------|:--------|:------------|:------------|:------------|:--------------|:-------------|:----------------|:----------------|:--------------|:------------|:---------|:--------|:-------------|:----------------|:------------|:-------------|:---------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | | | X | X | | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | X | X | | | | | | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | X | X | X | X | | X | | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | | X | | X | | | | | | | | X | X | | | X | X | X | | | X | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | | | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 7 | 8 |  |  |  |  |  | X | | X | | X | X | | X | | | | X | | | | | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | | X | | X | | | | X | X | X | X |
| CyberHarem/elisabeth_bathory_cinderella_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T06:23:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T08:17:17+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of elisabeth\_bathory\_cinderella/エリザベート・バートリー〔シンデレラ〕/伊丽莎白·巴托里〔灰姑娘〕 (Fate/Grand Order)
==============================================================================================
This is the dataset of elisabeth\_bathory\_cinderella/エリザベート・バートリー〔シンデレラ〕/伊丽莎白·巴托里〔灰姑娘〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are 'pink\_hair, long\_hair, blue\_eyes, pointy\_ears, horns, tail, dragon\_horns, bangs, dragon\_tail, curled\_horns, dragon\_girl, ribbon, two\_side\_up, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
caf7851fa50cabdd5378bb6f585a105bb90e7c75 |
# Dataset of kafka/カフカ/卡芙卡/카프카 (Honkai: Star Rail)
This is the dataset of kafka/カフカ/卡芙卡/카프카 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are `bangs, breasts, eyewear_on_head, long_hair, sunglasses, purple_hair, purple_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 578.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1374 | 1.28 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1374 | 2.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
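The IMG+TXT packages pair each image with a plain-text tag file. A hedged sketch for walking such a package after extraction — the same-stem `.txt`-next-to-image layout is an assumption about these zips, not something the table spells out.

```python
import os

# walk an extracted IMG+TXT package; assumes each image ships with a
# same-named .txt file holding its tags (an assumption, see note above)
extract_dir = 'dataset_800'
for name in sorted(os.listdir(extract_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    txt_path = os.path.join(extract_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```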
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kafka_starrail',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, white_shirt, jacket, long_sleeves, smile, looking_at_viewer, purple_gloves, closed_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, black_jacket, long_sleeves, looking_at_viewer, smile, solo, white_shirt, closed_mouth, pink_eyes, pink_hair, black_shorts, pink_gloves, open_clothes |
| 2 | 7 |  |  |  |  |  | 1girl, holding_instrument, long_sleeves, pantyhose, playing_instrument, smile, solo, violin, white_shirt, gloves, looking_at_viewer, closed_mouth, jacket, black_shorts, petals |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_shirt | jacket | long_sleeves | smile | looking_at_viewer | purple_gloves | closed_mouth | black_jacket | pink_eyes | pink_hair | black_shorts | pink_gloves | open_clothes | holding_instrument | pantyhose | playing_instrument | violin | gloves | petals |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:---------|:---------------|:--------|:--------------------|:----------------|:---------------|:---------------|:------------|:------------|:---------------|:--------------|:---------------|:---------------------|:------------|:---------------------|:---------|:---------|:---------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | X | X | X | X | X | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | X | | | X | X | X | X | X | X |
| CyberHarem/kafka_starrail | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T06:53:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T09:18:51+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kafka/カフカ/卡芙卡/카프카 (Honkai: Star Rail)
================================================
This is the dataset of kafka/カフカ/卡芙卡/카프카 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are 'bangs, breasts, eyewear\_on\_head, long\_hair, sunglasses, purple\_hair, purple\_eyes, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
262d857f0ecf3578ea4d477156513d960ce258f2 |
# Dataset of fu_xuan/符玄/符玄/부현 (Honkai: Star Rail)
This is the dataset of fu_xuan/符玄/符玄/부현 (Honkai: Star Rail), containing 419 images and their tags.
The core tags of this character are `long_hair, bangs, hair_ornament, pink_hair, parted_bangs, facial_mark, very_long_hair, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 419 | 957.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_xuan_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 419 | 419.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_xuan_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1091 | 935.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_xuan_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 419 | 778.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_xuan_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1091 | 1.45 GiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_xuan_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/fu_xuan_starrail',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
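To materialize a filtered subset on disk, the `item.image` handle printed above can be saved directly. This sketch assumes `item.image` behaves like a `PIL.Image.Image` (how waifuc items are typically exposed); treat that as an assumption rather than a documented guarantee.

```python
import os

from waifuc.source import LocalSource

# save every item carrying a chosen tag into its own folder;
# assumes item.image is PIL-compatible (has a .save() method)
dataset_dir = 'dataset_dir'
out_dir = 'fu_xuan_solo'
os.makedirs(out_dir, exist_ok=True)
for i, item in enumerate(LocalSource(dataset_dir)):
    if 'solo' in item.meta['tags']:
        item.image.save(os.path.join(out_dir, f'{i:04d}.png'))
```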
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, 1boy, hetero, nipples, penis, sex, solo_focus, open_mouth, sweat, vaginal, small_breasts, spread_legs, completely_nude, looking_at_viewer, navel, pink_eyes, collarbone, lying, mosaic_censoring, pov, pussy_juice, bar_censor, jewelry |
| 1 | 14 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, forehead_mark, hair_rings, white_background, simple_background, closed_mouth, parted_lips, purple_hair, white_dress, blush, upper_body |
| 2 | 39 |  |  |  |  |  | 1girl, dress, solo, forehead_mark, looking_at_viewer, bare_shoulders, jewelry, closed_mouth |
| 3 | 13 |  |  |  |  |  | 1girl, bare_shoulders, no_shoes, solo, dress, looking_at_viewer, sitting, toes, white_pantyhose, legs, full_body, soles, forehead_mark, blush, foot_focus, knees_up, purple_hair, hair_rings, indoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | 1boy | hetero | nipples | penis | sex | solo_focus | open_mouth | sweat | vaginal | small_breasts | spread_legs | completely_nude | looking_at_viewer | navel | pink_eyes | collarbone | lying | mosaic_censoring | pov | pussy_juice | bar_censor | jewelry | bare_shoulders | solo | forehead_mark | hair_rings | white_background | simple_background | closed_mouth | parted_lips | purple_hair | white_dress | upper_body | dress | no_shoes | sitting | toes | white_pantyhose | legs | full_body | soles | foot_focus | knees_up | indoors |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:---------|:----------|:--------|:------|:-------------|:-------------|:--------|:----------|:----------------|:--------------|:------------------|:--------------------|:--------|:------------|:-------------|:--------|:-------------------|:------|:--------------|:-------------|:----------|:-----------------|:-------|:----------------|:-------------|:-------------------|:--------------------|:---------------|:--------------|:--------------|:--------------|:-------------|:--------|:-----------|:----------|:-------|:------------------|:-------|:------------|:--------|:-------------|:-----------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 39 |  |  |  |  |  | X | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | | | | X | | | | | X | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fu_xuan_starrail | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T06:53:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T09:04:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fu\_xuan/符玄/符玄/부현 (Honkai: Star Rail)
================================================
This is the dataset of fu\_xuan/符玄/符玄/부현 (Honkai: Star Rail), containing 419 images and their tags.
The core tags of this character are 'long\_hair, bangs, hair\_ornament, pink\_hair, parted\_bangs, facial\_mark, very\_long\_hair, yellow\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3b43c9ca815235d1bd4363eaaa6df2ac29963217 |
# Dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail)
This is the dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, red_eyes, white_hair, hair_between_eyes, very_long_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.20 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 544.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1309 | 1.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 995.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1309 | 1.83 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/jingliu_starrail',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
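Per the package table above, the `800` variant simply caps the shorter image side at 800 pixels, and the same transform is easy to reproduce locally. A minimal sketch assuming PIL images:

```python
from PIL import Image

def cap_shorter_side(img: Image.Image, limit: int = 800) -> Image.Image:
    """Downscale so the shorter side does not exceed `limit` pixels."""
    w, h = img.size
    if min(w, h) <= limit:
        return img  # already within the limit
    scale = limit / min(w, h)
    return img.resize((round(w * scale), round(h * scale)), Image.LANCZOS)
```

Applied to the loader above, `cap_shorter_side(item.image)` would reproduce the 800-pixel variant image by image (again assuming `item.image` is a PIL image).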
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, closed_mouth, looking_at_viewer, solo, black_gloves, detached_sleeves, black_dress, cleavage, ponytail |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, earrings, looking_at_viewer, ponytail, solo, upper_body, closed_mouth, dress, cleavage, detached_sleeves, small_breasts, artist_name, hair_ribbon, medium_breasts |
| 2 | 14 |  |  |  |  |  | 1girl, solo, holding_sword, looking_at_viewer, bare_shoulders, black_gloves, full_moon, night, blue_dress, cleavage, medium_breasts, closed_mouth, grey_hair, ribbon, parted_lips, sky |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, solo, black_gloves, boots, holding_sword, armor, closed_mouth, parted_lips |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, solo, blue_dress, looking_at_viewer, medium_breasts, black_gloves, parted_lips, cleavage, bare_legs, barefoot, detached_sleeves, elbow_gloves, feet, full_body, grey_hair, sitting, toes, hair_over_one_eye, jewelry, moon |
| 5 | 5 |  |  |  |  |  | 1girl, black_footwear, knee_boots, looking_at_viewer, solo, bare_shoulders, black_gloves, full_body, high_heel_boots, medium_breasts, sitting, blue_dress, closed_mouth, detached_sleeves, hair_ribbon, simple_background, thighs, white_background, grey_hair, hair_over_one_eye, knee_up, large_breasts, white_skirt |
| 6 | 10 |  |  |  |  |  | 1girl, blush, completely_nude, navel, nipples, solo, closed_mouth, collarbone, large_breasts, blue_hair, hair_ribbon, looking_at_viewer, simple_background, white_background, medium_breasts, armpits, blue_ribbon, pussy |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, cowgirl_position, hetero, mosaic_censoring, navel, penis, pussy, solo_focus, blush, girl_on_top, large_breasts, nipples, pov, sex, vaginal, grey_hair, open_mouth, bare_shoulders, blindfold, completely_nude, cum, detached_sleeves, earrings, looking_at_viewer, night_sky, ribbon, smile, star_(sky), sweat, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | closed_mouth | looking_at_viewer | solo | black_gloves | detached_sleeves | black_dress | cleavage | ponytail | earrings | upper_body | dress | small_breasts | artist_name | hair_ribbon | medium_breasts | holding_sword | full_moon | night | blue_dress | grey_hair | ribbon | parted_lips | sky | boots | armor | bare_legs | barefoot | elbow_gloves | feet | full_body | sitting | toes | hair_over_one_eye | jewelry | moon | black_footwear | knee_boots | high_heel_boots | simple_background | thighs | white_background | knee_up | large_breasts | white_skirt | blush | completely_nude | navel | nipples | collarbone | blue_hair | armpits | blue_ribbon | pussy | 1boy | cowgirl_position | hetero | mosaic_censoring | penis | solo_focus | girl_on_top | pov | sex | vaginal | open_mouth | blindfold | cum | night_sky | smile | star_(sky) | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:-------|:---------------|:-------------------|:--------------|:-----------|:-----------|:-----------|:-------------|:--------|:----------------|:--------------|:--------------|:-----------------|:----------------|:------------|:--------|:-------------|:------------|:---------|:--------------|:------|:--------|:--------|:------------|:-----------|:---------------|:-------|:------------|:----------|:-------|:--------------------|:----------|:-------|:-----------------|:-------------|:------------------|:--------------------|:---------|:-------------------|:----------|:----------------|:--------------|:--------|:------------------|:--------|:----------|:-------------|:------------|:----------|:--------------|:--------|:-------|:-------------------|:---------|:-------------------|:--------|:-------------|:--------------|:------|:------|:----------|:-------------|:------------|:------|:------------|:--------|:-------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | X | | | | | X | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | X | X | X | X | | X | | | | | | | | X | | | | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | X | X | | | | X | X | | | | | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/jingliu_starrail | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T06:53:49+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T09:34:05+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail)
===============================================
This is the dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are 'long\_hair, bangs, breasts, red\_eyes, white\_hair, hair\_between\_eyes, very\_long\_hair, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1eba96e5e84278e1a692a97f38dfdf3423b53c32 | # Dataset Card for "cpm_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jay401521/cpm_test | [
"region:us"
] | 2024-01-13T07:00:40+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "options", "struct": [{"name": "<option_0>", "dtype": "string"}, {"name": "<option_1>", "dtype": "string"}]}, {"name": "<ans>", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4653361, "num_examples": 20000}], "download_size": 2631090, "dataset_size": 4653361}} | 2024-01-13T07:00:44+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cpm_test"
More Information needed | [
"# Dataset Card for \"cpm_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cpm_test\"\n\nMore Information needed"
] |
bbc977f4960ac34cabe19ba590eba89965aa08fa |
# Dataset of montpelier/モントピリア/蒙彼利埃 (Azur Lane)
This is the dataset of montpelier/モントピリア/蒙彼利埃 (Azur Lane), containing 95 images and their tags.
The core tags of this character are `red_eyes, long_hair, bangs, hair_ornament, grey_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 95 | 111.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/montpelier_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 95 | 64.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/montpelier_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 209 | 133.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/montpelier_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 95 | 99.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/montpelier_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 209 | 184.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/montpelier_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/montpelier_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
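For training workflows, a reproducible train/validation split of the loaded items is often the next step. A minimal sketch using only the standard library on top of the loader above:

```python
import random

from waifuc.source import LocalSource

# deterministic 90/10 split of the extracted items
items = list(LocalSource('dataset_dir'))
random.Random(42).shuffle(items)  # fixed seed keeps the split reproducible
cut = int(len(items) * 0.9)
train_items, val_items = items[:cut], items[cut:]
print(len(train_items), 'train /', len(val_items), 'val')
```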
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bare_shoulders, closed_mouth, cleavage, collarbone, black_dress, blush, sleeveless_dress, small_breasts, white_background, bare_legs, choker, halterneck, medium_breasts, red_dress, simple_background, sitting, bare_arms, bow, full_body, high_heels, official_alternate_costume, red_footwear, tattoo |
| 1 | 40 |  |  |  |  |  | 1girl, looking_at_viewer, fingerless_gloves, black_gloves, solo, armband, shorts, blush, black_thighhighs, white_background, animal_ears, simple_background |
| 2 | 11 |  |  |  |  |  | 1girl, solo, hair_flower, looking_at_viewer, blush, oil-paper_umbrella, red_kimono, wide_sleeves, closed_mouth, holding_umbrella, petals, smile, floral_print, fur_collar, fur_trim, animal_ears, folding_fan, holding_fan, long_sleeves, new_year |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bare_shoulders | closed_mouth | cleavage | collarbone | black_dress | blush | sleeveless_dress | small_breasts | white_background | bare_legs | choker | halterneck | medium_breasts | red_dress | simple_background | sitting | bare_arms | bow | full_body | high_heels | official_alternate_costume | red_footwear | tattoo | fingerless_gloves | black_gloves | armband | shorts | black_thighhighs | animal_ears | hair_flower | oil-paper_umbrella | red_kimono | wide_sleeves | holding_umbrella | petals | smile | floral_print | fur_collar | fur_trim | folding_fan | holding_fan | long_sleeves | new_year |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------------|:---------------|:-----------|:-------------|:--------------|:--------|:-------------------|:----------------|:-------------------|:------------|:---------|:-------------|:-----------------|:------------|:--------------------|:----------|:------------|:------|:------------|:-------------|:-----------------------------|:---------------|:---------|:--------------------|:---------------|:----------|:---------|:-------------------|:--------------|:--------------|:---------------------|:-------------|:---------------|:-------------------|:---------|:--------|:---------------|:-------------|:-----------|:--------------|:--------------|:---------------|:-----------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 40 |  |  |  |  |  | X | X | X | | | | | | X | | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
*Repository: `CyberHarem/montpelier_azurlane` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`7afe8245d447660f2c75c94aee1282b5d6b2f67a`
# Dataset of juno/ジュノー/天后 (Azur Lane)
This is the dataset of juno/ジュノー/天后 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `pink_hair, long_hair, crown, bangs, mini_crown, ribbon, twintails, pink_eyes, bow, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 29.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/juno_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 20.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/juno_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 59 | 42.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/juno_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 28.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/juno_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 59 | 57.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/juno_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/juno_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | looking_at_viewer, 1girl, solo, open_mouth, blush, collarbone, bare_shoulders, :d, dress, long_sleeves, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | open_mouth | blush | collarbone | bare_shoulders | :d | dress | long_sleeves | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:-------------|:--------|:-------------|:-----------------|:-----|:--------|:---------------|:--------------------|:-------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
*Repository: `CyberHarem/juno_azurlane` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`f3810de99d94287f24e6bc9828e0689acfe7bfcb`
# Dataset of washington/ワシントン/华盛顿 (Azur Lane)
This is the dataset of washington/ワシントン/华盛顿 (Azur Lane), containing 127 images and their tags.
The core tags of this character are `blue_eyes, breasts, large_breasts, short_hair, grey_hair, hair_between_eyes, mole, mole_on_breast, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 127 | 177.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/washington_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 127 | 93.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/washington_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 322 | 204.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/washington_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 127 | 151.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/washington_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 322 | 298.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/washington_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/washington_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
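
Because `LocalSource` exposes each item's `meta['tags']` (as printed in the snippet above), the raw package can be filtered with plain Python. The sketch below keeps only images tagged `solo` and saves them to a separate folder; it assumes `item.image` is a PIL image and that `meta['tags']` supports a membership test on tag names.

```python
import os

from waifuc.source import LocalSource

# reuse the directory extracted by the snippet above
source = LocalSource('dataset_dir')

output_dir = 'washington_solo'
os.makedirs(output_dir, exist_ok=True)

kept = 0
for item in source:
    # keep only solo shots; tag names are tested by membership
    if 'solo' in item.meta['tags']:
        item.image.save(os.path.join(output_dir, item.meta['filename']))
        kept += 1
print(f'kept {kept} solo images')
```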
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, black_gloves, cleavage, cross_necklace, jacket_on_shoulders, midriff, navel, pantyhose, skirt, solo, thigh_boots, thighhighs, choker, smile, looking_at_viewer, white_background, suspenders, simple_background, white_hair |
| 1 | 9 |  |  |  |  |  | 1girl, black_gloves, choker, cleavage, cross_necklace, jacket_on_shoulders, looking_at_viewer, smile, solo, midriff, blush, navel, pantyhose, skirt |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, choker, cleavage, cross_necklace, solo, earrings, jacket, looking_at_viewer, smile, blush, simple_background, upper_body, white_background, white_hair |
| 3 | 17 |  |  |  |  |  | rabbit_ears, fake_animal_ears, bare_shoulders, blush, cleavage, detached_collar, looking_at_viewer, navel, playboy_bunny, pantyhose, 1girl, black_gloves, black_necktie, midriff, solo, wrist_cuffs, simple_background, suspenders, collarbone, elbow_gloves, necktie_between_breasts, white_background, cowboy_shot, white_hair, black_shorts, half_gloves, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | cleavage | cross_necklace | jacket_on_shoulders | midriff | navel | pantyhose | skirt | solo | thigh_boots | thighhighs | choker | smile | looking_at_viewer | white_background | suspenders | simple_background | white_hair | blush | earrings | jacket | upper_body | rabbit_ears | fake_animal_ears | bare_shoulders | detached_collar | playboy_bunny | black_necktie | wrist_cuffs | collarbone | elbow_gloves | necktie_between_breasts | cowboy_shot | black_shorts | half_gloves | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:-----------------|:----------------------|:----------|:--------|:------------|:--------|:-------|:--------------|:-------------|:---------|:--------|:--------------------|:-------------------|:-------------|:--------------------|:-------------|:--------|:-----------|:---------|:-------------|:--------------|:-------------------|:-----------------|:------------------|:----------------|:----------------|:--------------|:-------------|:---------------|:--------------------------|:--------------|:---------------|:--------------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | | | | X | | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 17 |  |  |  |  |  | X | X | X | | | X | X | X | | X | | | | | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
*Repository: `CyberHarem/washington_azurlane` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`05357971ce5d3a3ad20111f348715f89c889a6b8`
# Dataset of unzen/雲仙/云仙 (Azur Lane)
This is the dataset of unzen/雲仙/云仙 (Azur Lane), containing 125 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, white_hair, purple_eyes, hair_over_one_eye, multicolored_hair, bangs, streaked_hair, very_long_hair, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 273.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 123.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 341 | 280.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 226.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 341 | 446.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/unzen_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
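
To check what a package contains before committing to a full extraction, the archive can also be inspected in place with the standard library; a short sketch:

```python
import zipfile

from huggingface_hub import hf_hub_download

# download the archive without extracting it
zip_file = hf_hub_download(
    repo_id='CyberHarem/unzen_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

with zipfile.ZipFile(zip_file, 'r') as zf:
    names = zf.namelist()
    images = [n for n in names if n.lower().endswith(('.png', '.jpg', '.jpeg', '.webp'))]
    print(f'{len(names)} files in total, {len(images)} of them images')
    for info in zf.infolist()[:5]:
        print(info.filename, info.file_size, 'bytes')
```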
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, purple_bikini, navel, thighs, blush, purple_hair, simple_background, white_background, smile, purple_choker, collarbone, highleg_bikini, o-ring_bikini, bare_shoulders |
| 1 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, one_eye_covered, solo, blue_butterfly, japanese_clothes, white_panties, wide_sleeves, animal_ears, blush, thighs |
| 2 | 27 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, wide_sleeves, one_eye_covered, holding_sword, katana, thigh_strap, dress, japanese_clothes, thighs, sheath, blue_butterfly, between_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | purple_bikini | navel | thighs | blush | purple_hair | simple_background | white_background | smile | purple_choker | collarbone | highleg_bikini | o-ring_bikini | bare_shoulders | one_eye_covered | blue_butterfly | japanese_clothes | white_panties | wide_sleeves | animal_ears | holding_sword | katana | thigh_strap | dress | sheath | between_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:----------------|:--------|:---------|:--------|:--------------|:--------------------|:-------------------|:--------|:----------------|:-------------|:-----------------|:----------------|:-----------------|:------------------|:-----------------|:-------------------|:----------------|:---------------|:--------------|:----------------|:---------|:--------------|:--------|:---------|:------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | | | | | | | | | | X | X | X | X | X | X | | | | | | |
| 2 | 27 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | | | | X | X | X | | X | | X | X | X | X | X | X |
*Repository: `CyberHarem/unzen_azurlane` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`6c9abb74101f39ef07e3b84ef72fbc87f1bddf69`
# Dataset of huohuo/フォフォ/藿藿/곽향 (Honkai: Star Rail)
This is the dataset of huohuo/フォフォ/藿藿/곽향 (Honkai: Star Rail), containing 137 images and their tags.
The core tags of this character are `green_hair, long_hair, bangs, ahoge, hair_ornament, hat, animal_ears, green_eyes, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 137 | 275.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/huohuo_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 137 | 128.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/huohuo_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 352 | 296.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/huohuo_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 137 | 228.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/huohuo_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 352 | 463.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/huohuo_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/huohuo_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
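
A quick global tag histogram complements the cluster tables below and is easy to compute from the raw package. The sketch reuses the `LocalSource` iteration from above; wrapping `meta['tags']` in `list()` keeps it working whether the field is a list of names or a tag-to-score mapping (its exact type is an assumption here).

```python
from collections import Counter

from waifuc.source import LocalSource

# reuse the directory extracted by the snippet above
source = LocalSource('dataset_dir')

counter = Counter()
for item in source:
    # list() yields the tag names for both list- and dict-shaped metadata
    counter.update(list(item.meta['tags']))

for tag, count in counter.most_common(15):
    print(f'{count:4d}  {tag}')
```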
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, long_sleeves, solo, bare_shoulders, holding, looking_at_viewer, open_mouth, shirt, shorts, blush, detached_sleeves, sleeveless, simple_background, white_background, blue_headwear, full_body, red_ribbon, white_socks, knees_up, off_shoulder |
| 1 | 13 |  |  |  |  |  | 1girl, long_sleeves, solo, looking_at_viewer, shorts, holding, open_mouth, blush, closed_mouth |
| 2 | 9 |  |  |  |  |  | 1girl, solo, long_sleeves, shoes, shorts, full_body, looking_at_viewer, white_socks, black_footwear, open_mouth, holding, loose_socks, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | bare_shoulders | holding | looking_at_viewer | open_mouth | shirt | shorts | blush | detached_sleeves | sleeveless | simple_background | white_background | blue_headwear | full_body | red_ribbon | white_socks | knees_up | off_shoulder | closed_mouth | shoes | black_footwear | loose_socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-----------------|:----------|:--------------------|:-------------|:--------|:---------|:--------|:-------------------|:-------------|:--------------------|:-------------------|:----------------|:------------|:-------------|:--------------|:-----------|:---------------|:---------------|:--------|:-----------------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | | | | | | | | | | X | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | X | X | | X | | | | X | X | | X | | X | | | | X | X | X |
*Repository: `CyberHarem/huohuo_starrail` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`8282467b9168e013ded0c4c78dd0067f3b0bf713`

# Dataset Card for "audioset_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

*Repository: `Codec-SUPERB/audioset_synth` — tags: region:us; created 2024-01-13, last modified 2024-01-28. Features: `audio` (16 kHz) and `id` (string); 20,111 examples per split, with one `original` split plus 19 codec-resynthesized splits (AcademiCodec, AudioDec, DAC, EnCodec, FunCodec, and SpeechTokenizer variants); total size ≈ 164 GB.*
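
A minimal sketch of loading this repository with the `datasets` library; the split and column names come from the repository metadata summarized above.

```python
from datasets import load_dataset

# the original clips and one codec-resynthesized counterpart
original = load_dataset("Codec-SUPERB/audioset_synth", split="original")
resynth = load_dataset("Codec-SUPERB/audioset_synth", split="encodec_24k_6bps")

sample = original[0]
print(sample["id"])                      # clip identifier
print(sample["audio"]["sampling_rate"])  # 16000
print(sample["audio"]["array"].shape)    # decoded waveform (numpy array)
```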
`715e95f4b78acf6208f19c5f32cf017c278af265`

# Dataset Card for "c_x86_simd_extension"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

*Repository: `zhangshuoming/c_x86_simd_extension` — tags: region:us; created and last modified 2024-01-13. Features: `text` (string); `train` split with 540 examples (≈0.7 MB).*
`78cb4135a36abf8b01459f4640b04a6630b1fede`

# Dataset Card for "c_x86_simd_extension_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

*Repository: `zhangshuoming/c_x86_simd_extension_filtered` — tags: region:us; created and last modified 2024-01-13. Features: `text` (string); `train` split with 428 examples (≈0.6 MB).*
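
Both repositories expose a single `text` column and a `train` split (per the metadata above), so comparing the filtered subset against the full set takes only a few lines; a sketch:

```python
from datasets import load_dataset

full = load_dataset("zhangshuoming/c_x86_simd_extension", split="train")
filtered = load_dataset("zhangshuoming/c_x86_simd_extension_filtered", split="train")

# 540 vs. 428 examples, per the repository metadata
print(len(full), len(filtered))
print(filtered[0]["text"][:200])  # each example is a single `text` field
```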
`f438f6720b730d5aced9dac15c08345e280ae0c1`

# Dataset Card for "medical-dpo-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

*Repository: `mii-llm/medical-ita-dpo-small` — tags: region:us; created and last modified 2024-01-13. Features: `title`, `input`, `chosen`, `rejected`, `rejected_model` (all strings); `train` split with 361 examples (≈1.9 MB).*
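
The column names suggest preference pairs for DPO-style tuning. Below is a hedged sketch of reshaping the data into the prompt/chosen/rejected layout commonly expected by preference trainers — treating `input` as the prompt is an assumption based on the column names alone.

```python
from datasets import load_dataset

ds = load_dataset("mii-llm/medical-ita-dpo-small", split="train")

def to_dpo(example):
    # map the repository's columns onto the usual DPO triple;
    # `input` as the prompt is an assumption from the names
    return {
        "prompt": example["input"],
        "chosen": example["chosen"],
        "rejected": example["rejected"],
    }

dpo_ds = ds.map(to_dpo, remove_columns=["title", "input", "rejected_model"])
print(dpo_ds[0]["prompt"][:120])
```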
`6aaca1abcad1eca3ae0eb6c6c786bd79c16a0eb0`
# Dataset of guinaifen/桂乃芬/桂乃芬/계네빈 (Honkai: Star Rail)
This is the dataset of guinaifen/桂乃芬/桂乃芬/계네빈 (Honkai: Star Rail), containing 59 images and their tags.
The core tags of this character are `long_hair, hair_ornament, yellow_eyes, bangs, breasts, hair_between_eyes, side_ponytail, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 59 | 104.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinaifen_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 59 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinaifen_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 140 | 107.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinaifen_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 59 | 88.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinaifen_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 140 | 170.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinaifen_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guinaifen_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, open_mouth, looking_at_viewer, solo, :d, black_gloves, white_background, bare_shoulders, simple_background, orange_hair, red_dress, chinese_clothes, choker, flower, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | looking_at_viewer | solo | :d | black_gloves | white_background | bare_shoulders | simple_background | orange_hair | red_dress | chinese_clothes | choker | flower | blush |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:-----|:---------------|:-------------------|:-----------------|:--------------------|:--------------|:------------|:------------------|:---------|:---------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
*Repository: `CyberHarem/guinaifen_starrail` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`66ce9ce551e8e857de7f6a052d32f7c029f09a11`
# Dataset of hanya/寒鴉/寒鸦/한아 (Honkai: Star Rail)
This is the dataset of hanya/寒鴉/寒鸦/한아 (Honkai: Star Rail), containing 56 images and their tags.
The core tags of this character are `bangs, long_hair, breasts, hair_between_eyes, hair_ornament, large_breasts, blue_eyes, grey_hair, horns, blue_hair, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 108.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanya_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 56 | 51.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanya_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 143 | 113.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanya_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 56 | 90.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanya_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 143 | 168.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanya_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hanya_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, looking_at_viewer, solo, cleavage, elbow_gloves, holding, parted_lips, sitting, thighs, jewelry, blue_dress |
| 1 | 5 |  |  |  |  |  | 1girl, holding, looking_at_viewer, simple_background, solo, bare_shoulders, black_gloves, closed_mouth, dress, jewelry, white_background, cleavage, elbow_gloves, upper_body, tassel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | looking_at_viewer | solo | cleavage | elbow_gloves | holding | parted_lips | sitting | thighs | jewelry | blue_dress | simple_background | closed_mouth | dress | white_background | upper_body | tassel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:-------|:-----------|:---------------|:----------|:--------------|:----------|:---------|:----------|:-------------|:--------------------|:---------------|:--------|:-------------------|:-------------|:---------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | X | | X | X | X | X | X | X |
*Repository: `CyberHarem/hanya_starrail` — license: MIT; tags: text-to-image, n<1K, art, not-for-all-audiences; created and last modified 2024-01-13.*

`4840010470899a5c5bc6cbf02ac2b0259db50cf3`
# Dataset of hook/フック/虎克/후크 (Honkai: Star Rail)
This is the dataset of hook/フック/虎克/후크 (Honkai: Star Rail), containing 43 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, yellow_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 51.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hook_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 30.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hook_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 97 | 64.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hook_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 46.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hook_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 97 | 89.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hook_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hook_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, fur_trim, jacket, open_mouth, black_gloves, looking_at_viewer, long_sleeves, blush, twintails, simple_background, :d, shorts |
| 1 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, loli, solo_focus, open_mouth, penis, black_gloves, blush, sex, bar_censor, pussy, tongue |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | fur_trim | jacket | open_mouth | black_gloves | looking_at_viewer | long_sleeves | blush | twintails | simple_background | :d | shorts | 1boy | hetero | loli | solo_focus | penis | sex | bar_censor | pussy | tongue |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------|:-------------|:---------------|:--------------------|:---------------|:--------|:------------|:--------------------|:-----|:---------|:-------|:---------|:-------|:-------------|:--------|:------|:-------------|:--------|:---------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | X | X | | | X | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/hook_starrail | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T09:42:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T10:02:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hook/フック/虎克/후크 (Honkai: Star Rail)
=============================================
This is the dataset of hook/フック/虎克/후크 (Honkai: Star Rail), containing 43 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, hat, yellow\_eyes, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
232d4f7dca5125d1d06c1e818c262002b0bd2852 |
<div align="center">
<img src="Yi_logo.svg" width="150px" style="display: inline-block;">
<img src="m-a-p.png" width="150px" style="display: inline-block;">
</div>
## SMuPT: Symbolic Music Generative Pre-trained Transformer
SMuPT is a series of pre-trained models for symbolic music generation, trained on a large-scale dataset of symbolic music that includes millions of monophonic and polyphonic pieces from different genres and styles. The models use the Llama 2 architecture and can be applied to downstream music generation tasks such as melody generation, accompaniment generation, and multi-track music generation.
- 09/01/2024: a series of pre-trained SMuPT models is released, with parameter counts ranging from 110M to 1.3B.
## Model architecture
The architectural details of SMuPT-v0 are listed below:
| Name | Parameters | Training Data (Music Pieces) | Seq Length | Hidden Size | Layers | Heads |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: |
| SMuPT-v0-8192-110M | 110M | 7M x 5.8 epochs | 8192 | 768 | 12 | 12 |
| SMuPT-v0-8192-345M | 345M | 7M x 4 epochs | 8192 | 1024 | 24 | 16 |
| SMuPT-v0-8192-770M | 770M | 7M x 3 epochs | 8192 | 1280 | 36 | 20 |
| SMuPT-v0-8192-1.3B | 1.3B | 7M x 2.2 epochs | 8192 | 1536 | 48 | 24 |
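For orientation, the 1.3B row can also be written out as a Hugging Face `LlamaConfig` — purely illustrative, since the card notes Huggingface-format support is still pending, and the vocabulary size is not stated here:

```python
from transformers import LlamaConfig

# hypothetical HF-style config for SMuPT-v0-8192-1.3B, filled in from the
# table above and the serving script below; vocab_size is a placeholder
config = LlamaConfig(
    hidden_size=1536,
    num_hidden_layers=48,
    num_attention_heads=24,
    num_key_value_heads=24,      # NUM_QUERY_GROUP in the serving script
    intermediate_size=6144,      # FFN_HIDDEN_SIZE for the 1.3B model
    max_position_embeddings=8192,
    rms_norm_eps=1e-5,
    tie_word_embeddings=False,   # matches --untie-embeddings-and-output-weights
    vocab_size=50257,            # placeholder: not specified in this card
)
```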
## Model Usage
There are several ways to use our pre-trained SMuPT models; here we demonstrate usage based on [Megatron-LM](https://github.com/NVIDIA/Megatron-LM/tree/main). Huggingface format will be supported soon.
Before starting, make sure you have set up the relevant environment and codebase.
```shell
# pull Megatron-LM codebase
mkdir -p /path/to/workspace && cd /path/to/workspace
git clone https://github.com/NVIDIA/Megatron-LM.git
# download the pre-trained SMuPT models checkpoint and vocab files from Huggingface page
mkdir -p /models/SMuPT_v0_8192_1.3B && cd /models/SMuPT_v0_8192_1.3B
wget -O model_optim_rng.pt https://huggingface.co/m-a-p/SMuPT_v0_8192_1.3B/resolve/main/model_optim_rng.pt?download=true
wget -O newline.vocab https://huggingface.co/m-a-p/SMuPT_v0_8192_1.3B/resolve/main/newline.vocab?download=true
wget -O newline.txt https://huggingface.co/m-a-p/SMuPT_v0_8192_1.3B/resolve/main/newline.txt?download=true
```
We recommend using the latest version of [NGC's PyTorch container](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch) for SMuPT inference; see [Megatron-LM](https://github.com/NVIDIA/Megatron-LM/tree/main) for more details.
```shell
# pull the latest NGC's PyTorch container, mount the workspace directory and enter the container
docker run --gpus all -it --name megatron --shm-size=16g -v $PWD:/workspace -p 5000:5000 nvcr.io/nvidia/pytorch:23.11-py3 /bin/bash
```
Once you enter the container, you can start a REST server for inference.
<details>
<summary>Click to expand the example script</summary>
```shell
#!/bin/bash
# This example will start serving the 1.3B model.
export CUDA_DEVICE_MAX_CONNECTIONS=1
DISTRIBUTED_ARGS="--nproc_per_node 1 \
--nnodes 1 \
--node_rank 0 \
--master_addr localhost \
--master_port 6000"
CHECKPOINT=/path/to/model/checkpoint/folder
VOCAB_FILE=/path/to/vocab/file
MERGE_FILE=/path/to/merge/file
MODEL_SIZE="1.3B"
if [[ ${MODEL_SIZE} == "110M" ]]; then HIDDEN_SIZE=768; NUM_HEAD=12; NUM_QUERY_GROUP=12; NUM_LAYERS=12; FFN_HIDDEN_SIZE=3072; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "345M" ]]; then HIDDEN_SIZE=1024; NUM_HEAD=16; NUM_QUERY_GROUP=16; NUM_LAYERS=24; FFN_HIDDEN_SIZE=4096; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "770M" ]]; then HIDDEN_SIZE=1280; NUM_HEAD=20; NUM_QUERY_GROUP=20; NUM_LAYERS=36; FFN_HIDDEN_SIZE=5120; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "1.3B" ]]; then HIDDEN_SIZE=1536; NUM_HEAD=24; NUM_QUERY_GROUP=24; NUM_LAYERS=48; FFN_HIDDEN_SIZE=6144; NORM_EPS=1e-5;
else echo "invalid MODEL_SIZE: ${MODEL_SIZE}"; exit 1
fi
MAX_SEQ_LEN=8192
MAX_POSITION_EMBEDDINGS=8192
pip install flask-restful
torchrun $DISTRIBUTED_ARGS tools/run_text_generation_server.py \
--tensor-model-parallel-size 1 \
--pipeline-model-parallel-size 1 \
--num-layers ${NUM_LAYERS} \
--hidden-size ${HIDDEN_SIZE} \
--ffn-hidden-size ${FFN_HIDDEN_SIZE} \
--load ${CHECKPOINT} \
--group-query-attention \
--num-query-groups ${NUM_QUERY_GROUP} \
--position-embedding-type rope \
--num-attention-heads ${NUM_HEAD} \
--max-position-embeddings ${MAX_POSITION_EMBEDDINGS} \
--tokenizer-type GPT2BPETokenizer \
--normalization RMSNorm \
--norm-epsilon ${NORM_EPS} \
--make-vocab-size-divisible-by 1 \
--swiglu \
--use-flash-attn \
--bf16 \
--micro-batch-size 1 \
--disable-bias-linear \
--no-bias-gelu-fusion \
--untie-embeddings-and-output-weights \
--seq-length ${MAX_SEQ_LEN} \
--vocab-file $VOCAB_FILE \
--merge-file $MERGE_FILE \
--attention-dropout 0.0 \
--hidden-dropout 0.0 \
--weight-decay 1e-1 \
--clip-grad 1.0 \
--adam-beta1 0.9 \
--adam-beta2 0.95 \
--adam-eps 1e-8 \
  --seed 42
```
</details>
Use cURL to query the server directly. Note that the newline token `\n` is represented by `<n>` in the vocabulary, so `\n` must be replaced with `<n>` in the prompt, and `<n>` mapped back to `\n` in the generated tokens.
```shell
curl 'http://localhost:6000/api' -X 'PUT' -H 'Content-Type: application/json; charset=UTF-8' -d '{"prompts":["X:1<n>L:1/8<n>M:4/4<n>K:G<n>GA"], "tokens_to_generate":4096}'
```
Processed Output:
```shell
X:1
L:1/8
M:4/4
K:G
GA | B2 B2 B2 (cd) | B2 A2 z2 AB | c2 c2 c2 (de) | d4 z2 B2 | d2 d2 d2 e>d | c2 B2 z2 dB |
A2 A2 A2 B2 | G4 z2 GA | B2 B2 B2 cd | B2 A2 z2 AB | c2 c2 e2 dc | d4 z2 GA | B2 B2 B2 cd |
B2 A2 z2 dB | A3 G A2 B2 | G4 z2 |]
```
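The round trip can also be scripted. Here is a minimal Python sketch — the response field name (`text`) follows the Megatron-LM text-generation server and is an assumption here:

```python
import requests

prompt = "X:1\nL:1/8\nM:4/4\nK:G\nGA"

# encode: swap real newlines for the model's '<n>' token
payload = {"prompts": [prompt.replace("\n", "<n>")],
           "tokens_to_generate": 4096}
response = requests.put("http://localhost:6000/api", json=payload)

# decode: map '<n>' back to newlines to recover readable ABC notation
abc_text = response.json()["text"][0].replace("<n>", "\n")
print(abc_text)
```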
Once you render the generated tokens as audio, you will hear the following music.
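One possible rendering path — music21 is an illustrative choice here, not something the card prescribes — parses the ABC text and writes a MIDI file that any synthesizer can turn into audio:

```python
from music21 import converter

# hypothetical decode step: parse the generated ABC notation (saved to a
# file, e.g. from the sketch above) and write it out as MIDI
abc_text = open("generated.abc", encoding="utf-8").read()
score = converter.parse(abc_text, format="abc")
score.write("midi", fp="generated.mid")
```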
<audio controls src="https://cdn-uploads.huggingface.co/production/uploads/640701cb4dc5f2846c91d4eb/Ows-HvaSuZfqAZvOjT4LX.mpga"></audio> | m-a-p/SMuPT_v0_8192_770M | [
"language:en",
"license:apache-2.0",
"music",
"art",
"region:us"
] | 2024-01-13T09:56:51+00:00 | {"language": ["en"], "license": "apache-2.0", "pipeline_tag": "text-generation", "tags": ["music", "art"]} | 2024-01-13T09:58:06+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #music #art #region-us
|


SMuPT: Symbolic Music Generative Pre-trained Transformer
--------------------------------------------------------
SMuPT is a series of pre-trained models for symbolic music generation, trained on a large-scale dataset of symbolic music that includes millions of monophonic and polyphonic pieces from different genres and styles. The models use the Llama 2 architecture and can be applied to downstream music generation tasks such as melody generation, accompaniment generation, and multi-track music generation.
* 09/01/2024: a series of pre-trained SMuPT models is released, with parameter counts ranging from 110M to 1.3B.
Model architecture
------------------
The architectural details of SMuPT-v0 are listed below:
Model Usage
-----------
There are several ways to use our pre-trained SMuPT models; here we demonstrate usage based on Megatron-LM. Huggingface format will be supported soon.
Before starting, make sure you have set up the relevant environment and codebase.
We recommend using the latest version of NGC's PyTorch container for SMuPT inference; see Megatron-LM for more details.
Once you enter the container, you can start a REST server for inference.
Click to expand the example script
```
#!/bin/bash
# This example will start serving the 1.3B model.
export CUDA_DEVICE_MAX_CONNECTIONS=1
DISTRIBUTED_ARGS="--nproc_per_node 1 \
--nnodes 1 \
--node_rank 0 \
--master_addr localhost \
--master_port 6000"
CHECKPOINT=/path/to/model/checkpoint/folder
VOCAB_FILE=/path/to/vocab/file
MERGE_FILE=/path/to/merge/file
MODEL_SIZE="1.3B"
if [[ ${MODEL_SIZE} == "110M" ]]; then HIDDEN_SIZE=768; NUM_HEAD=12; NUM_QUERY_GROUP=12; NUM_LAYERS=12; FFN_HIDDEN_SIZE=3072; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "345M" ]]; then HIDDEN_SIZE=1024; NUM_HEAD=16; NUM_QUERY_GROUP=16; NUM_LAYERS=24; FFN_HIDDEN_SIZE=4096; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "770M" ]]; then HIDDEN_SIZE=1280; NUM_HEAD=20; NUM_QUERY_GROUP=20; NUM_LAYERS=36; FFN_HIDDEN_SIZE=5120; NORM_EPS=1e-5;
elif [[ ${MODEL_SIZE} == "1.3B" ]]; then HIDDEN_SIZE=1536; NUM_HEAD=24; NUM_QUERY_GROUP=24; NUM_LAYERS=48; FFN_HIDDEN_SIZE=6144; NORM_EPS=1e-5;
else echo "invalid MODEL_SIZE: ${MODEL_SIZE}"; exit 1
fi
MAX_SEQ_LEN=8192
MAX_POSITION_EMBEDDINGS=8192
pip install flask-restful
torchrun $DISTRIBUTED_ARGS tools/run_text_generation_server.py \
--tensor-model-parallel-size 1 \
--pipeline-model-parallel-size 1 \
--num-layers ${NUM_LAYERS} \
--hidden-size ${HIDDEN_SIZE} \
--ffn-hidden-size ${FFN_HIDDEN_SIZE} \
--load ${CHECKPOINT} \
--group-query-attention \
--num-query-groups ${NUM_QUERY_GROUP} \
--position-embedding-type rope \
--num-attention-heads ${NUM_HEAD} \
--max-position-embeddings ${MAX_POSITION_EMBEDDINGS} \
--tokenizer-type GPT2BPETokenizer \
--normalization RMSNorm \
--norm-epsilon ${NORM_EPS} \
--make-vocab-size-divisible-by 1 \
--swiglu \
--use-flash-attn \
--bf16 \
--micro-batch-size 1 \
--disable-bias-linear \
--no-bias-gelu-fusion \
--untie-embeddings-and-output-weights \
--seq-length ${MAX_SEQ_LEN} \
--vocab-file $VOCAB_FILE \
--merge-file $MERGE_FILE \
--attention-dropout 0.0 \
--hidden-dropout 0.0 \
--weight-decay 1e-1 \
--clip-grad 1.0 \
--adam-beta1 0.9 \
--adam-beta2 0.95 \
--adam-eps 1e-8 \
--seed 42
```
Use cURL to query the server directly. Note that the newline token '\n' is represented by '<n>' in the vocabulary, so we need to replace the newline token with '<n>' in the prompt and map '<n>' back to newlines in the generated tokens.
Processed Output:
Once you encode the generated tokens into audio, you will hear the following music.
<audio controls src="URL
| [
"# This example will start serving the 1.3B model.\nexport CUDA_DEVICE_MAX_CONNECTIONS=1\n\nDISTRIBUTED_ARGS=\"--nproc_per_node 1 \\\n --nnodes 1 \\\n --node_rank 0 \\\n --master_addr localhost \\\n --master_port 6000\"\n\nCHECKPOINT=/path/to/model/checkpoint/folder\nVOCAB_FILE=/path/to/vocab/file\nMERGE_FILE=/path/to/merge/file\n\nMODEL_SIZE=\"1.3B\"\nif [[ ${MODEL_SIZE} == \"110M\" ]]; then HIDDEN_SIZE=768; NUM_HEAD=12; NUM_QUERY_GROUP=12; NUM_LAYERS=12; FFN_HIDDEN_SIZE=3072; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"345M\" ]]; then HIDDEN_SIZE=1024; NUM_HEAD=16; NUM_QUERY_GROUP=16; NUM_LAYERS=24; FFN_HIDDEN_SIZE=4096; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"770M\" ]]; then HIDDEN_SIZE=1280; NUM_HEAD=20; NUM_QUERY_GROUP=20; NUM_LAYERS=36; FFN_HIDDEN_SIZE=5120; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"1.3B\" ]]; then HIDDEN_SIZE=1536; NUM_HEAD=24; NUM_QUERY_GROUP=24; NUM_LAYERS=48; FFN_HIDDEN_SIZE=6144; NORM_EPS=1e-5;\nelse echo \"invalid MODEL_SIZE: ${MODEL_SIZE}\"; exit 1\nfi\nMAX_SEQ_LEN=8192\nMAX_POSITION_EMBEDDINGS=8192\n\npip install flask-restful\n\ntorchrun $DISTRIBUTED_ARGS tools/run_text_generation_server.py \\\n --tensor-model-parallel-size 1 \\\n --pipeline-model-parallel-size 1 \\\n --num-layers ${NUM_LAYERS} \\\n --hidden-size ${HIDDEN_SIZE} \\\n --ffn-hidden-size ${FFN_HIDDEN_SIZE} \\\n --load ${CHECKPOINT} \\\n --group-query-attention \\\n --num-query-groups ${NUM_QUERY_GROUP} \\\n --position-embedding-type rope \\\n --num-attention-heads ${NUM_HEAD} \\\n --max-position-embeddings ${MAX_POSITION_EMBEDDINGS} \\\n --tokenizer-type GPT2BPETokenizer \\\n --normalization RMSNorm \\\n --norm-epsilon ${NORM_EPS} \\\n --make-vocab-size-divisible-by 1 \\\n --swiglu \\\n --use-flash-attn \\\n --bf16 \\\n --micro-batch-size 1 \\\n --disable-bias-linear \\\n --no-bias-gelu-fusion \\\n --untie-embeddings-and-output-weights \\\n --seq-length ${MAX_SEQ_LEN} \\\n --vocab-file $VOCAB_FILE \\\n --merge-file $MERGE_FILE \\\n --attention-dropout 0.0 \\\n --hidden-dropout 0.0 \\\n --weight-decay 1e-1 \\\n --clip-grad 1.0 \\\n --adam-beta1 0.9 \\\n --adam-beta2 0.95 \\\n --adam-eps 1e-8 \\\n --seed 42\n\n```\n\n\nUse CURL to query the server directly, note that the newline token '\\n' is represented by '' in the vocabulary, so we need to replace the newline token with '' in both the prompt and the generated tokens.\n\n\nProcessed Output:\n\n\nOnce you encode the generated tokens into audio, you will hear the following music.\n\n\n<audio controls src=\"URL"
] | [
"TAGS\n#language-English #license-apache-2.0 #music #art #region-us \n",
"# This example will start serving the 1.3B model.\nexport CUDA_DEVICE_MAX_CONNECTIONS=1\n\nDISTRIBUTED_ARGS=\"--nproc_per_node 1 \\\n --nnodes 1 \\\n --node_rank 0 \\\n --master_addr localhost \\\n --master_port 6000\"\n\nCHECKPOINT=/path/to/model/checkpoint/folder\nVOCAB_FILE=/path/to/vocab/file\nMERGE_FILE=/path/to/merge/file\n\nMODEL_SIZE=\"1.3B\"\nif [[ ${MODEL_SIZE} == \"110M\" ]]; then HIDDEN_SIZE=768; NUM_HEAD=12; NUM_QUERY_GROUP=12; NUM_LAYERS=12; FFN_HIDDEN_SIZE=3072; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"345M\" ]]; then HIDDEN_SIZE=1024; NUM_HEAD=16; NUM_QUERY_GROUP=16; NUM_LAYERS=24; FFN_HIDDEN_SIZE=4096; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"770M\" ]]; then HIDDEN_SIZE=1280; NUM_HEAD=20; NUM_QUERY_GROUP=20; NUM_LAYERS=36; FFN_HIDDEN_SIZE=5120; NORM_EPS=1e-5;\nelif [[ ${MODEL_SIZE} == \"1.3B\" ]]; then HIDDEN_SIZE=1536; NUM_HEAD=24; NUM_QUERY_GROUP=24; NUM_LAYERS=48; FFN_HIDDEN_SIZE=6144; NORM_EPS=1e-5;\nelse echo \"invalid MODEL_SIZE: ${MODEL_SIZE}\"; exit 1\nfi\nMAX_SEQ_LEN=8192\nMAX_POSITION_EMBEDDINGS=8192\n\npip install flask-restful\n\ntorchrun $DISTRIBUTED_ARGS tools/run_text_generation_server.py \\\n --tensor-model-parallel-size 1 \\\n --pipeline-model-parallel-size 1 \\\n --num-layers ${NUM_LAYERS} \\\n --hidden-size ${HIDDEN_SIZE} \\\n --ffn-hidden-size ${FFN_HIDDEN_SIZE} \\\n --load ${CHECKPOINT} \\\n --group-query-attention \\\n --num-query-groups ${NUM_QUERY_GROUP} \\\n --position-embedding-type rope \\\n --num-attention-heads ${NUM_HEAD} \\\n --max-position-embeddings ${MAX_POSITION_EMBEDDINGS} \\\n --tokenizer-type GPT2BPETokenizer \\\n --normalization RMSNorm \\\n --norm-epsilon ${NORM_EPS} \\\n --make-vocab-size-divisible-by 1 \\\n --swiglu \\\n --use-flash-attn \\\n --bf16 \\\n --micro-batch-size 1 \\\n --disable-bias-linear \\\n --no-bias-gelu-fusion \\\n --untie-embeddings-and-output-weights \\\n --seq-length ${MAX_SEQ_LEN} \\\n --vocab-file $VOCAB_FILE \\\n --merge-file $MERGE_FILE \\\n --attention-dropout 0.0 \\\n --hidden-dropout 0.0 \\\n --weight-decay 1e-1 \\\n --clip-grad 1.0 \\\n --adam-beta1 0.9 \\\n --adam-beta2 0.95 \\\n --adam-eps 1e-8 \\\n --seed 42\n\n```\n\n\nUse CURL to query the server directly, note that the newline token '\\n' is represented by '' in the vocabulary, so we need to replace the newline token with '' in both the prompt and the generated tokens.\n\n\nProcessed Output:\n\n\nOnce you encode the generated tokens into audio, you will hear the following music.\n\n\n<audio controls src=\"URL"
] |
1bf7b307d47136343f75bc4b2f05c2cbcb72eec9 |
# Dataset of chen_hai/鎮海/镇海 (Azur Lane)
This is the dataset of chen_hai/鎮海/镇海 (Azur Lane), containing 132 images and their tags.
The core tags of this character are `black_hair, breasts, large_breasts, hair_ornament, long_hair, bangs, purple_eyes, red_eyes, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 132 | 243.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 132 | 112.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 331 | 245.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 132 | 200.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 331 | 366.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chen_hai_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, bodystocking, china_dress, elbow_gloves, lace-trimmed_gloves, looking_at_viewer, official_alternate_costume, pantyhose, solo, taut_dress, black_rose, black_gloves, brown_gloves, cleavage, white_background, blush, parted_lips, simple_background, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | bodystocking | china_dress | elbow_gloves | lace-trimmed_gloves | looking_at_viewer | official_alternate_costume | pantyhose | solo | taut_dress | black_rose | black_gloves | brown_gloves | cleavage | white_background | blush | parted_lips | simple_background | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:--------------|:---------------|:----------------------|:--------------------|:-----------------------------|:------------|:-------|:-------------|:-------------|:---------------|:---------------|:-----------|:-------------------|:--------|:--------------|:--------------------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chen_hai_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T10:15:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T11:05:24+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of chen\_hai/鎮海/镇海 (Azur Lane)
======================================
This is the dataset of chen\_hai/鎮海/镇海 (Azur Lane), containing 132 images and their tags.
The core tags of this character are 'black\_hair, breasts, large\_breasts, hair\_ornament, long\_hair, bangs, purple\_eyes, red\_eyes, hair\_flower', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
796ef54f66b9ad258564e30dd8ce982b6b191141 | # Dataset Card for "vietnamese-retrieval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | thanhdath/vietnamese-retrieval | [
"region:us"
] | 2024-01-13T10:57:25+00:00 | {"dataset_info": {"features": [{"name": "query_id", "dtype": "string"}, {"name": "query", "dtype": "string"}, {"name": "positive_passages", "list": [{"name": "docid", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "negative_passages", "list": [{"name": "docid", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 2745183922, "num_examples": 273386}], "download_size": 927038024, "dataset_size": 2745183922}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T11:49:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vietnamese-retrieval"
More Information needed | [
"# Dataset Card for \"vietnamese-retrieval\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vietnamese-retrieval\"\n\nMore Information needed"
] |
1394e43c8fa12c4a5b8d017b22ac06b474d3c368 |
# Dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane)
This is the dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane), containing 68 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, hat, grey_hair, black_headwear, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 125.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 60.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 133.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 105.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 200.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lutzow_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_skirt, drill_locks, looking_at_viewer, solo, white_shirt, red_nails, braid, detached_sleeves, nail_polish, non-humanoid_robot, smile, thighhighs, belt, drill_hair, feet_out_of_frame, grey_eyes |
| 1 | 10 |  |  |  |  |  | 1girl, black_skirt, detached_sleeves, looking_at_viewer, solo, long_sleeves, white_shirt, bare_shoulders, black_footwear, mini_hat, open_mouth, black_thighhighs, thigh_boots, braid, one_eye_closed, simple_background, sitting, grey_eyes, high-waist_skirt, medium_hair, stuffed_toy, white_background, ;o, holding, nail_polish, red_nails, yawning |
| 2 | 21 |  |  |  |  |  | looking_at_viewer, red_eyes, 1girl, black_dress, cleavage, official_alternate_costume, solo, thighhighs, white_hair, bare_shoulders, blush, tongue_out, drill_locks, hair_ornament, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | drill_locks | looking_at_viewer | solo | white_shirt | red_nails | braid | detached_sleeves | nail_polish | non-humanoid_robot | smile | thighhighs | belt | drill_hair | feet_out_of_frame | grey_eyes | long_sleeves | bare_shoulders | black_footwear | mini_hat | open_mouth | black_thighhighs | thigh_boots | one_eye_closed | simple_background | sitting | high-waist_skirt | medium_hair | stuffed_toy | white_background | ;o | holding | yawning | red_eyes | black_dress | cleavage | official_alternate_costume | white_hair | blush | tongue_out | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:--------------------|:-------|:--------------|:------------|:--------|:-------------------|:--------------|:---------------------|:--------|:-------------|:-------|:-------------|:--------------------|:------------|:---------------|:-----------------|:-----------------|:-----------|:-------------|:-------------------|:--------------|:-----------------|:--------------------|:----------|:-------------------|:--------------|:--------------|:-------------------|:-----|:----------|:----------|:-----------|:--------------|:-----------|:-----------------------------|:-------------|:--------|:-------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | | X | X | X | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
| CyberHarem/lutzow_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T11:11:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T11:33:02+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane)
========================================
This is the dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane), containing 68 images and their tags.
The core tags of this character are 'breasts, long\_hair, large\_breasts, hat, grey\_hair, black\_headwear, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b5f84f6a80bef19d1e75326cc4b444e3d41f1ce3 |
# Dataset of aurora/オーロラ/欧若拉 (Azur Lane)
This is the dataset of aurora/オーロラ/欧若拉 (Azur Lane), containing 90 images and their tags.
The core tags of this character are `blonde_hair, long_hair, green_eyes, breasts, bangs, large_breasts, very_long_hair, medium_breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 194.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 90 | 85.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 222 | 181.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 90 | 160.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 222 | 293.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aurora_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, china_dress, smile, black_thighhighs, closed_mouth, simple_background, white_background, bare_shoulders, cleavage, full_body, high_heels, pelvic_curtain, standing, clothing_cutout, flower, folding_fan, garter_straps, holding_fan, side_slit, black_gloves, blue_dress, bridal_gauntlets, covered_navel, earrings, hair_ornament, low-tied_long_hair, panties, petals, red_dress, red_footwear |
| 1 | 17 |  |  |  |  |  | 1girl, blush, looking_at_viewer, looking_back, sweat, from_behind, smile, solo, thighs, backboob, huge_breasts, nude, veil, earrings, armlet, bracelet, curvy, thighhighs, huge_ass, on_stomach, sideboob |
| 2 | 6 |  |  |  |  |  | 1girl, blush, closed_mouth, huge_breasts, looking_at_viewer, nipples, smile, solo, sweat, thighs, veil, jewelry, nail_polish, pubic_tattoo, pussy, navel, nude, outdoors, armlet, collarbone, detached_sleeves, lips, night, piercing, see-through, stomach, thighhighs |
| 3 | 21 |  |  |  |  |  | 1girl, bare_shoulders, solo, blush, cleavage, detached_sleeves, long_sleeves, looking_at_viewer, smile, belt, pleated_skirt, black_skirt, closed_mouth, garter_straps, hair_ribbon, white_thighhighs, hair_ornament, petals, sitting, black_ribbon, bowtie, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | china_dress | smile | black_thighhighs | closed_mouth | simple_background | white_background | bare_shoulders | cleavage | full_body | high_heels | pelvic_curtain | standing | clothing_cutout | flower | folding_fan | garter_straps | holding_fan | side_slit | black_gloves | blue_dress | bridal_gauntlets | covered_navel | earrings | hair_ornament | low-tied_long_hair | panties | petals | red_dress | red_footwear | looking_back | sweat | from_behind | thighs | backboob | huge_breasts | nude | veil | armlet | bracelet | curvy | thighhighs | huge_ass | on_stomach | sideboob | nipples | jewelry | nail_polish | pubic_tattoo | pussy | navel | outdoors | collarbone | detached_sleeves | lips | night | piercing | see-through | stomach | long_sleeves | belt | pleated_skirt | black_skirt | hair_ribbon | white_thighhighs | sitting | black_ribbon | bowtie | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------------|:--------|:-------------------|:---------------|:--------------------|:-------------------|:-----------------|:-----------|:------------|:-------------|:-----------------|:-----------|:------------------|:---------|:--------------|:----------------|:--------------|:------------|:---------------|:-------------|:-------------------|:----------------|:-----------|:----------------|:---------------------|:----------|:---------|:------------|:---------------|:---------------|:--------|:--------------|:---------|:-----------|:---------------|:-------|:-------|:---------|:-----------|:--------|:-------------|:-----------|:-------------|:-----------|:----------|:----------|:--------------|:---------------|:--------|:--------|:-----------|:-------------|:-------------------|:-------|:--------|:-----------|:--------------|:----------|:---------------|:-------|:----------------|:--------------|:--------------|:-------------------|:----------|:---------------|:---------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 21 |  |  |  |  |  | X | X | X | X | | X | | X | | | X | X | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aurora_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T11:11:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T11:55:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of aurora/オーロラ/欧若拉 (Azur Lane)
======================================
This is the dataset of aurora/オーロラ/欧若拉 (Azur Lane), containing 90 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, green\_eyes, breasts, bangs, large\_breasts, very\_long\_hair, medium\_breasts, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fc772d3fc233d89a4ca29b0dffbfdc634bb993b1 |
# Dataset of aquila/アクィラ/天鹰 (Azur Lane)
This is the dataset of aquila/アクィラ/天鹰 (Azur Lane), containing 125 images and their tags.
The core tags of this character are `breasts, long_hair, green_eyes, large_breasts, grey_hair, very_long_hair, hat, white_headwear, braid, hair_between_eyes, sun_hat, bangs, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 222.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aquila_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 109.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aquila_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 326 | 246.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aquila_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 190.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aquila_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 326 | 378.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aquila_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aquila_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | black_bikini, official_alternate_costume, 1girl, looking_at_viewer, solo, cleavage, highleg_bikini, navel, white_choker, outdoors, smile, bare_shoulders, blush, necklace, thigh_strap, ribbon, blue_sky, day, open_mouth, standing, water |
| 1 | 7 |  |  |  |  |  | 1girl, black_headwear, looking_at_viewer, official_alternate_costume, solo, white_dress, cleavage, blush, smile, covered_navel, jewelry, sleeveless_dress, thighhighs |
| 2 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, penis, open_mouth, sex, solo_focus, vaginal, black_bikini, navel, official_alternate_costume, pussy, bikini_bottom_aside, looking_at_viewer, mosaic_censoring, cum, jewelry, spread_legs, thigh_strap |
| 3 | 5 |  |  |  |  |  | 1girl, black_gloves, solo, dress, full_body, high_heels, holding_weapon, looking_at_viewer, thigh_strap, white_thighhighs, absurdly_long_hair, black_headwear, simple_background, white_background, white_capelet, white_hair, aiguillette, blush, holding_cane, long_sleeves, low_twintails, medium_breasts, rigging, sideboob, standing, white_ascot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | black_bikini | official_alternate_costume | 1girl | looking_at_viewer | solo | cleavage | highleg_bikini | navel | white_choker | outdoors | smile | bare_shoulders | blush | necklace | thigh_strap | ribbon | blue_sky | day | open_mouth | standing | water | black_headwear | white_dress | covered_navel | jewelry | sleeveless_dress | thighhighs | 1boy | hetero | nipples | penis | sex | solo_focus | vaginal | pussy | bikini_bottom_aside | mosaic_censoring | cum | spread_legs | black_gloves | dress | full_body | high_heels | holding_weapon | white_thighhighs | absurdly_long_hair | simple_background | white_background | white_capelet | white_hair | aiguillette | holding_cane | long_sleeves | low_twintails | medium_breasts | rigging | sideboob | white_ascot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:-----------------------------|:--------|:--------------------|:-------|:-----------|:-----------------|:--------|:---------------|:-----------|:--------|:-----------------|:--------|:-----------|:--------------|:---------|:-----------|:------|:-------------|:-----------|:--------|:-----------------|:--------------|:----------------|:----------|:-------------------|:-------------|:-------|:---------|:----------|:--------|:------|:-------------|:----------|:--------|:----------------------|:-------------------|:------|:--------------|:---------------|:--------|:------------|:-------------|:-----------------|:-------------------|:---------------------|:--------------------|:-------------------|:----------------|:-------------|:--------------|:---------------|:---------------|:----------------|:-----------------|:----------|:-----------|:--------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | | X | X | X | X | X | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | | | X | | | | | X | | X | | | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | | X | X | X | | | | | | | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aquila_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T11:12:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T11:45:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of aquila/アクィラ/天鹰 (Azur Lane)
=====================================
This is the dataset of aquila/アクィラ/天鹰 (Azur Lane), containing 125 images and their tags.
The core tags of this character are 'breasts, long\_hair, green\_eyes, large\_breasts, grey\_hair, very\_long\_hair, hat, white\_headwear, braid, hair\_between\_eyes, sun\_hat, bangs, single\_braid', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2408779e19fa3a6135eed6ac78eb569ecdf03285 | First >15k imperative questions generated with Mixtral (4-bit) | SebastianBodza/wikipedia-22-12-de-dpr | [
"region:us"
] | 2024-01-13T11:14:30+00:00 | {} | 2024-01-15T09:17:29+00:00 | [] | [] | TAGS
#region-us
| First >15k imperative questions generated with Mixtral (4-bit) | [] | [
"TAGS\n#region-us \n"
] |
eaf3c52a68c36bb8c55c5bb9db24564e003f2341 |
This dataset is the Parquet version of the dataset created by [mio](https://huggingface.co/mio/).
Original dataset link: https://huggingface.co/datasets/mio/sukasuka-anime-vocal-dataset
Please make sure to follow and heart-react the original author (≧∇≦)ノ
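A minimal loading sketch — the repo id and the `audio`/`label` feature names are taken from this card's metadata:

```python
from datasets import load_dataset

# fetch one labeled voice clip; 'audio' and 'label' are the features
# declared in this card's metadata
ds = load_dataset("lowres/sukasuka-anime-vocal-dataset", split="train")
sample = ds[0]
print(sample["audio"]["sampling_rate"],
      ds.features["label"].int2str(sample["label"]))
```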
| lowres/sukasuka-anime-vocal-dataset | [
"task_categories:audio-classification",
"size_categories:1K<n<10K",
"language:ja",
"license:other",
"region:us"
] | 2024-01-13T11:58:57+00:00 | {"language": ["ja"], "license": "other", "size_categories": ["1K<n<10K"], "task_categories": ["audio-classification"], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Almaria", "1": "Almita", "2": "Buronny", "3": "Chtholly", "4": "Collon", "5": "EbonCandle", "6": "Elq", "7": "Godley", "8": "Grick", "9": "Ithea", "10": "Lakhesh", "11": "Lillia", "12": "Limeskin", "13": "Margomedari", "14": "Narration", "15": "Nephren", "16": "Nopht", "17": "Nygglatho", "18": "Pannibal", "19": "Phyracorlybia", "20": "Rhantolk", "21": "SilverClover", "22": "Suowong", "23": "SuowongYoung", "24": "Tiat", "25": "Willem"}}}}], "splits": [{"name": "train", "num_bytes": 1528660644, "num_examples": 3495}], "download_size": 1465797251, "dataset_size": 1528660644}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T12:11:17+00:00 | [] | [
"ja"
] | TAGS
#task_categories-audio-classification #size_categories-1K<n<10K #language-Japanese #license-other #region-us
|
This dataset is the parquet version of the dataset that was created by mio.
Original dataset link: URL
Please make sure to follow and heart-react the original author (≧∇≦)ノ
| [] | [
"TAGS\n#task_categories-audio-classification #size_categories-1K<n<10K #language-Japanese #license-other #region-us \n"
] |
9ef95efaf40802c43f4399b34b8d5e4817a88a50 | Small dataset containing sentences from the same text. The source column was OCRed with Tesseract 5.0; the target column was extracted from epub files.
I have not applied human annotation. | GaborMadarasz/ocr_silver | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:hu",
"license:apache-2.0",
"region:us"
] | 2024-01-13T12:13:28+00:00 | {"language": ["hu"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["translation"]} | 2024-01-14T14:50:02+00:00 | [] | [
"hu"
] | TAGS
#task_categories-translation #size_categories-10K<n<100K #language-Hungarian #license-apache-2.0 #region-us
| Small dataset containing sentences from the same text. The source column was OCRed with Tesseract 5.0; the target column was extracted from epub files.
I have not applied human annotation. | [] | [
"TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-Hungarian #license-apache-2.0 #region-us \n"
] |
be96bcbf70db7d829fa1022ad963192a983279cb | A set of NER-related questions about multimessenger astronomy. | vvsotnikov/mm-astronomy | [
"task_categories:question-answering",
"task_ids:open-domain-qa",
"task_ids:multiple-choice-qa",
"size_categories:n<1K",
"language:en",
"license:mit",
"logical reasoning",
"reading comprehension",
"common sense",
"astrophysics",
"region:us"
] | 2024-01-13T12:42:50+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "task_ids": ["open-domain-qa", "multiple-choice-qa"], "tags": ["logical reasoning", "reading comprehension", "common sense", "astrophysics"], "metrics": ["multiple_choice_grade"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "train", "num_examples": 190}, {"name": "validation", "num_examples": 568}]}} | 2024-02-14T06:26:06+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_ids-open-domain-qa #task_ids-multiple-choice-qa #size_categories-n<1K #language-English #license-mit #logical reasoning #reading comprehension #common sense #astrophysics #region-us
| A set of NER-related questions about multimessenger astronomy. | [] | [
"TAGS\n#task_categories-question-answering #task_ids-open-domain-qa #task_ids-multiple-choice-qa #size_categories-n<1K #language-English #license-mit #logical reasoning #reading comprehension #common sense #astrophysics #region-us \n"
] |
79475550d3a931a0377d44c68da0e7e2df8eef52 |
# Dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane)
This is the dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane), containing 74 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, hair_ornament, multicolored_hair, large_breasts, hairclip, pink_hair, two-tone_hair, hair_between_eyes, bangs, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 97.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 59.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 122.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 86.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 171.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
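The packed variants can also be fetched directly with `hf_hub_download` instead of using the links above (a short sketch using the 800px package; the filename comes from the table):

```python
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/marblehead_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
print(zip_file)  # local cache path of the downloaded archive
```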
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/marblehead_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, smile, black_pants, blush, midriff, navel, parted_lips, short_hair_with_long_locks, sleeveless, cleavage_cutout, gyaru, long_hair, standing, sweat, crop_top, sports_bra, bare_arms, black_shirt, purple_hair, simple_background, symbol-shaped_pupils, white_background |
| 1 | 9 |  |  |  |  |  | 1girl, breast_tattoo, gyaru, looking_at_viewer, navel_piercing, short_hair_with_long_locks, short_shorts, smile, solo, thighhighs, cleavage, id_card, black_shorts, leotard_under_clothes, simple_background, white_background, full_body, white_coat, sleeves_past_fingers |
| 2 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, bare_shoulders, garter_straps, smile, solo, black_thighhighs, blush, ass, off-shoulder_sweater, short_hair_with_long_locks, cleavage, official_alternate_costume, standing, sweater_dress, christmas, from_behind, gift, long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | smile | black_pants | blush | midriff | navel | parted_lips | short_hair_with_long_locks | sleeveless | cleavage_cutout | gyaru | long_hair | standing | sweat | crop_top | sports_bra | bare_arms | black_shirt | purple_hair | simple_background | symbol-shaped_pupils | white_background | breast_tattoo | navel_piercing | short_shorts | thighhighs | cleavage | id_card | black_shorts | leotard_under_clothes | full_body | white_coat | sleeves_past_fingers | garter_straps | black_thighhighs | ass | off-shoulder_sweater | official_alternate_costume | sweater_dress | christmas | from_behind | gift |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:--------|:--------------|:--------|:----------|:--------|:--------------|:-----------------------------|:-------------|:------------------|:--------|:------------|:-----------|:--------|:-----------|:-------------|:------------|:--------------|:--------------|:--------------------|:-----------------------|:-------------------|:----------------|:-----------------|:---------------|:-------------|:-----------|:----------|:---------------|:------------------------|:------------|:-------------|:-----------------------|:----------------|:-------------------|:------|:-----------------------|:-----------------------------|:----------------|:------------|:--------------|:-------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | X | | | | | | X | | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | | X | | | | X | | | | X | X | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/marblehead_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T13:15:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T13:31:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane)
===============================================
This is the dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane), containing 74 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, breasts, hair\_ornament, multicolored\_hair, large\_breasts, hairclip, pink\_hair, two-tone\_hair, hair\_between\_eyes, bangs, sidelocks', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e5e3a4259a4a3d7c767117afa8ccb9994b76ae55 | # Understanding Tagalog Questions and tasks (~2023)
"Raws".
UTaQk is a small (informal) collection of questions and writing tasks meant to test the capability of language models, but in Tagalog.
The questions are loosely categorized into:
* 0 - Puzzles/riddles
* 1 - Languages/translation
* 2 - Writing tasks
* 3 - Creative writing
* 4 - Facts
* 5 - Coding
* 6 - Logic
Feel free to add/expand and/or correct any mistranslations.
WIP
| 922-Narra/UTaQk | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T13:19:02+00:00 | {"license": "apache-2.0"} | 2024-01-13T14:02:29+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Understanding Tagalog Questions and tasks (~2023)
"Raws".
UTaQk is a small (informal) collection of questions and writing tasks meant to test the capability of language models, but in Tagalog.
The questions are loosely categorized into:
* 0 - Puzzles/riddles
* 1 - Languages/translation
* 2 - Writing tasks
* 3 - Creative writing
* 4 - Facts
* 5 - Coding
* 6 - Logic
Feel free to add/expand and/or correct any mistranslations.
WIP
| [
"# Understanding Tagalog Questions and tasks (~2023)\n\"Raws\".\n\nUTaQk is a small (informal) collection of questions and writing tasks meant to test the capability of language models, but in Tagalog.\n\nThe questions are loosely categorized into:\n* 0 - Puzzles/riddles\n* 1 - Languages/translation\n* 2 - Writing tasks\n* 3 - Creative writing\n* 4 - Facts\n* 5 - Coding\n* 6 - Logic\n\nFeel free to add/expand and/or correct any mistranslations.\n\nWIP"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Understanding Tagalog Questions and tasks (~2023)\n\"Raws\".\n\nUTaQk is a small (informal) collection of questions and writing tasks meant to test the capability of language models, but in Tagalog.\n\nThe questions are loosely categorized into:\n* 0 - Puzzles/riddles\n* 1 - Languages/translation\n* 2 - Writing tasks\n* 3 - Creative writing\n* 4 - Facts\n* 5 - Coding\n* 6 - Logic\n\nFeel free to add/expand and/or correct any mistranslations.\n\nWIP"
] |
d53be2fb3b2fb8fbb0711cea63350b08d6d62ad1 |
# Dataset of golden_hind/ゴールデン・ハインド/金鹿号 (Azur Lane)
This is the dataset of golden_hind/ゴールデン・ハインド/金鹿号 (Azur Lane), containing 68 images and their tags.
The core tags of this character are `breasts, long_hair, horns, black_hair, large_breasts, blue_eyes, bangs, very_long_hair, mole, mole_under_mouth`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 158.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 72.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 180 | 157.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 129.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 180 | 251.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/golden_hind_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, looking_at_viewer, solo, tentacles, blush, navel, tongue_out, cleavage, open_mouth, smile, dress, armpits, bare_shoulders, chain, nail_polish, revealing_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | tentacles | blush | navel | tongue_out | cleavage | open_mouth | smile | dress | armpits | bare_shoulders | chain | nail_polish | revealing_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:------------|:--------|:--------|:-------------|:-----------|:-------------|:--------|:--------|:----------|:-----------------|:--------|:--------------|:--------------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/golden_hind_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T13:42:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T14:03:24+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of golden\_hind/ゴールデン・ハインド/金鹿号 (Azur Lane)
==================================================
This is the dataset of golden\_hind/ゴールデン・ハインド/金鹿号 (Azur Lane), containing 68 images and their tags.
The core tags of this character are 'breasts, long\_hair, horns, black\_hair, large\_breasts, blue\_eyes, bangs, very\_long\_hair, mole, mole\_under\_mouth', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a7d29a45b994ea4bfcedc31891853b1a11f70b46 |
# Dataset of u_556/U-556 (Azur Lane)
This is the dataset of u_556/U-556 (Azur Lane), containing 37 images and their tags.
The core tags of this character are `bangs, blue_hair, twintails, red_eyes, blunt_bangs, short_hair, breasts, short_twintails, multicolored_hair, long_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 39.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_556_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 37 | 24.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_556_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 90 | 52.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_556_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 37 | 35.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_556_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 90 | 69.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_556_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/u_556_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | long_sleeves, open_mouth, 1girl, white_bikini, :d, solo, navel, upper_teeth_only, blush, open_jacket, red_gloves, small_breasts, socks, iron_cross, holding, looking_at_viewer, rudder_footwear, air_bubble, red_footwear, underboob |
| 1 | 6 |  |  |  |  |  | white_dress, 1girl, black_gloves, blush, looking_at_viewer, open_mouth, solo, :d, black_bow, pink_hair, two-tone_hair, upper_teeth_only, white_capelet, blue_flower, fur_trim, hair_flower, holding, purple_hair, short_sleeves, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | long_sleeves | open_mouth | 1girl | white_bikini | :d | solo | navel | upper_teeth_only | blush | open_jacket | red_gloves | small_breasts | socks | iron_cross | holding | looking_at_viewer | rudder_footwear | air_bubble | red_footwear | underboob | white_dress | black_gloves | black_bow | pink_hair | two-tone_hair | white_capelet | blue_flower | fur_trim | hair_flower | purple_hair | short_sleeves | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:-------------|:--------|:---------------|:-----|:-------|:--------|:-------------------|:--------|:--------------|:-------------|:----------------|:--------|:-------------|:----------|:--------------------|:------------------|:-------------|:---------------|:------------|:--------------|:---------------|:------------|:------------|:----------------|:----------------|:--------------|:-----------|:--------------|:--------------|:----------------|:--------------------|:-------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | X | X | | X | X | | X | X | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/u_556_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T13:42:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T13:52:43+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of u\_556/U-556 (Azur Lane)
===================================
This is the dataset of u\_556/U-556 (Azur Lane), containing 37 images and their tags.
The core tags of this character are 'bangs, blue\_hair, twintails, red\_eyes, blunt\_bangs, short\_hair, breasts, short\_twintails, multicolored\_hair, long\_hair, sidelocks', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fe752b216195fc9a9802456f66f05a9080053f6d |
# Dataset Card for "ai2_arc" translated into Hindi
This is the Hindi-translated version of "ai2_arc", produced using the IndicTrans2 model ([Gala et al., 2023](https://openreview.net/forum?id=vfT4YuzAYA)).

We recommend visiting the "ai2_arc" Hugging Face dataset card ([link](https://huggingface.co/datasets/allenai/ai2_arc)) for details.
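A minimal loading sketch (the config names and splits are taken from this repo's metadata; the example is illustrative):

```python
from datasets import load_dataset

# the Hindi ARC ships in the same two configs as the English original
challenge = load_dataset("ai4bharat/ai2_arc-hi", "ARC-Challenge", split="validation")
easy = load_dataset("ai4bharat/ai2_arc-hi", "ARC-Easy", split="test")

# each example mirrors ai2_arc: id, question, choices (text/label), answerKey
print(challenge[0]["question"], challenge[0]["choices"]["label"])
```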
| ai4bharat/ai2_arc-hi | [
"task_categories:question-answering",
"task_ids:open-domain-qa",
"task_ids:multiple-choice-qa",
"annotations_creators:found",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"source_datasets:original",
"language:hi",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-01-13T14:05:15+00:00 | {"annotations_creators": ["found"], "language_creators": ["found"], "language": ["hi"], "license": ["cc-by-sa-4.0"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["question-answering"], "task_ids": ["open-domain-qa", "multiple-choice-qa"], "pretty_name": "Ai2Arc", "language_bcp47": ["en-US"], "dataset_info": [{"config_name": "ARC-Challenge", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 375511, "num_examples": 1172}, {"name": "validation", "num_bytes": 96660, "num_examples": 299}], "download_size": 449460, "dataset_size": 821931}, {"config_name": "ARC-Easy", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 657514, "num_examples": 2376}, {"name": "validation", "num_bytes": 157394, "num_examples": 570}], "download_size": 762935, "dataset_size": 1433908}], "configs": [{"config_name": "ARC-Challenge", "data_files": [{"split": "test", "path": "ARC-Challenge/test-*"}, {"split": "validation", "path": "ARC-Challenge/validation-*"}]}, {"config_name": "ARC-Easy", "data_files": [{"split": "test", "path": "ARC-Easy/test-*"}, {"split": "validation", "path": "ARC-Easy/validation-*"}]}]} | 2024-01-23T11:48:56+00:00 | [] | [
"hi"
] | TAGS
#task_categories-question-answering #task_ids-open-domain-qa #task_ids-multiple-choice-qa #annotations_creators-found #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Hindi #license-cc-by-sa-4.0 #region-us
|
# Dataset Card for "ai2_arc" translated into Hindi
This is the Hindi-translated version of "ai2_arc", produced using the IndicTrans2 model (Gala et al., 2023).

We recommend visiting the "ai2_arc" Hugging Face dataset card (link) for details.
| [
"# Dataset Card for \"ai2_arc\" translated into Hindi\n\nThis is Hindi translated version of \"ai2_arc\" using the IndicTrans2 model (Gala et al., 2023).\n\nWe recommend you to visit the \"ai2_arc\" huggingface dataset card (link) for the details."
] | [
"TAGS\n#task_categories-question-answering #task_ids-open-domain-qa #task_ids-multiple-choice-qa #annotations_creators-found #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Hindi #license-cc-by-sa-4.0 #region-us \n",
"# Dataset Card for \"ai2_arc\" translated into Hindi\n\nThis is Hindi translated version of \"ai2_arc\" using the IndicTrans2 model (Gala et al., 2023).\n\nWe recommend you to visit the \"ai2_arc\" huggingface dataset card (link) for the details."
] |
cc06da72f1e31aaaf10c67c4b76b1f138278e8dc | # Dataset Card for "12-01-2024-last-2000-row-QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CaoHaiNam/12-01-2024-last-2000-row-QA | [
"region:us"
] | 2024-01-13T14:25:01+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1129802, "num_examples": 633}], "download_size": 535918, "dataset_size": 1129802}} | 2024-01-13T14:25:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "12-01-2024-last-2000-row-QA"
More Information needed | [
"# Dataset Card for \"12-01-2024-last-2000-row-QA\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"12-01-2024-last-2000-row-QA\"\n\nMore Information needed"
] |
bea092568fa4790d92895196a443c2cc7787ddc2 | # Dataset Card for "12-01-2024-last-2000-row-QA-segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CaoHaiNam/12-01-2024-last-2000-row-QA-segmentation | [
"region:us"
] | 2024-01-13T14:54:27+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1148467, "num_examples": 633}], "download_size": 540709, "dataset_size": 1148467}} | 2024-01-13T14:54:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "12-01-2024-last-2000-row-QA-segmentation"
More Information needed | [
"# Dataset Card for \"12-01-2024-last-2000-row-QA-segmentation\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"12-01-2024-last-2000-row-QA-segmentation\"\n\nMore Information needed"
] |
3b856aead91a6ea34d5b00b183bb286377f8f420 |
# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tuantran1632001/Psyfighter2-Orca2-13B-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties",
"harness_winogrande_5",
split="train")
```
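To pull the aggregated numbers instead of per-task details, the "results" configuration mentioned above can be loaded the same way (a sketch; per this card, the "train" split points at the latest run):

```python
from datasets import load_dataset

# aggregated metrics for the latest evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties",
    "results",
    split="train",
)
```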
## Latest results
These are the [latest results from run 2024-01-13T15:05:15.491956](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties/blob/main/results_2024-01-13T15-05-15.491956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.603383654987928,
"acc_stderr": 0.03303267584618269,
"acc_norm": 0.607142680047232,
"acc_norm_stderr": 0.033700954867739115,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5540489547722205,
"mc2_stderr": 0.01582448369078134
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.01415063143511173
},
"harness|hellaswag|10": {
"acc": 0.6296554471220872,
"acc_stderr": 0.00481910045686781,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415437
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278008,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278008
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878948,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878948
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458026,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458026
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290923,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862557,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862557
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.015901432608930354,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.015901432608930354
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106567,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106567
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5540489547722205,
"mc2_stderr": 0.01582448369078134
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091088
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905488
}
}
```
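For a quick sanity check on these numbers, the per-task entries can be aggregated directly (a sketch; it assumes the JSON object above has been saved locally as `results.json` with the same top-level keys — the actual file linked above may wrap them in additional structure):

```python
import json

# load the results object shown above (assumed saved locally as results.json)
with open("results.json") as f:
    results = json.load(f)

# mean accuracy over the MMLU (hendrycksTest) subtasks
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"mean MMLU acc over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```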
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties | [
"region:us"
] | 2024-01-13T15:07:34+00:00 | {"pretty_name": "Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [tuantran1632001/Psyfighter2-Orca2-13B-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:05:15.491956](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties/blob/main/results_2024-01-13T15-05-15.491956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.603383654987928,\n \"acc_stderr\": 0.03303267584618269,\n \"acc_norm\": 0.607142680047232,\n \"acc_norm_stderr\": 0.033700954867739115,\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6296554471220872,\n \"acc_stderr\": 0.00481910045686781,\n \"acc_norm\": 0.8173670583549094,\n \"acc_norm_stderr\": 0.0038557568514415437\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 
0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278008,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278008\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 
0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458026,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458026\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290923,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290923\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 
0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n \"acc_stderr\": 0.014583812465862557,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.014583812465862557\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n \"acc_stderr\": 0.015901432608930354,\n \"acc_norm\": 0.3452513966480447,\n \"acc_norm_stderr\": 0.015901432608930354\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106567,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106567\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \"acc_stderr\": 0.013661649780905488\n }\n}\n```", "repo_url": "https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["**/details_harness|winogrande|5_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-13T15-05-15.491956.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_05_15.491956", "path": ["results_2024-01-13T15-05-15.491956.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-05-15.491956.parquet"]}]}]} | 2024-01-13T15:07:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties
Dataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-13B-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
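```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties",
	"harness_winogrande_5",
	split="train")
```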
## Latest results
These are the latest results from run 2024-01-13T15:05:15.491956 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties\n\n\n\nDataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-13B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:05:15.491956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties\n\n\n\nDataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-13B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:05:15.491956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
985b15164535f29af6dbbbdcdd41ccbf2e02c3cd |
# Dataset Card for Evaluation run of damerajee/Oot-v2_lll
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [damerajee/Oot-v2_lll](https://huggingface.co/damerajee/Oot-v2_lll) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_damerajee__Oot-v2_lll",
"harness_winogrande_5",
split="train")
```
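Once loaded, the split behaves like any other `datasets` split; as a quick, purely illustrative check you can list its columns and peek at the first logged example (`data` here is the variable from the snippet above):

```python
# Illustrative inspection of the loaded split; the exact columns depend on
# what the evaluation harness logged for this task.
print(data.column_names)
print(data[0])
```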
## Latest results
These are the [latest results from run 2024-01-13T15:05:46.112716](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Oot-v2_lll/blob/main/results_2024-01-13T15-05-46.112716.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6541530142901238,
"acc_stderr": 0.03197706952839702,
"acc_norm": 0.6540116792008602,
"acc_norm_stderr": 0.03263780198638047,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6256716337528857,
"mc2_stderr": 0.01513290351648502
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441372,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6799442342162916,
"acc_stderr": 0.004655442766599467,
"acc_norm": 0.8659629555865366,
"acc_norm_stderr": 0.003399958334372064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168589,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168589
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6256716337528857,
"mc2_stderr": 0.01513290351648502
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.7217589082638363,
"acc_stderr": 0.012343803671422677
}
}
```
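The aggregated metrics above are also exposed through the dedicated "results" configuration, so they can be retrieved without parsing this card. A minimal sketch, assuming the `datasets` library is installed (the "results" configuration and its "latest" split are declared in this dataset's configs):

```python
from datasets import load_dataset

# Load only the aggregated metrics; the "latest" split always points
# to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_damerajee__Oot-v2_lll",
                       "results",
                       split="latest")
```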
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_damerajee__Oot-v2_lll | [
"region:us"
] | 2024-01-13T15:08:07+00:00 | {"pretty_name": "Evaluation run of damerajee/Oot-v2_lll", "dataset_summary": "Dataset automatically created during the evaluation run of model [damerajee/Oot-v2_lll](https://huggingface.co/damerajee/Oot-v2_lll) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_damerajee__Oot-v2_lll\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:05:46.112716](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Oot-v2_lll/blob/main/results_2024-01-13T15-05-46.112716.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6541530142901238,\n \"acc_stderr\": 0.03197706952839702,\n \"acc_norm\": 0.6540116792008602,\n \"acc_norm_stderr\": 0.03263780198638047,\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6256716337528857,\n \"mc2_stderr\": 0.01513290351648502\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441372,\n \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6799442342162916,\n \"acc_stderr\": 0.004655442766599467,\n \"acc_norm\": 0.8659629555865366,\n \"acc_norm_stderr\": 0.003399958334372064\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168589,\n \"acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168589\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n 
\"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6256716337528857,\n \"mc2_stderr\": 0.01513290351648502\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \"acc_stderr\": 0.012343803671422677\n }\n}\n```", "repo_url": "https://huggingface.co/damerajee/Oot-v2_lll", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-46.112716.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-46.112716.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-46.112716.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-46.112716.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-46.112716.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-46.112716.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["**/details_harness|winogrande|5_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-05-46.112716.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_05_46.112716", "path": ["results_2024-01-13T15-05-46.112716.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T15-05-46.112716.parquet"]}]}]} | 2024-01-13T15:08:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of damerajee/Oot-v2_lll
Dataset automatically created during the evaluation run of model damerajee/Oot-v2_lll on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
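```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_damerajee__Oot-v2_lll",
	"harness_winogrande_5",
	split="train")
```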
## Latest results
These are the latest results from run 2024-01-13T15:05:46.112716 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one under the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of damerajee/Oot-v2_lll\n\n\n\nDataset automatically created during the evaluation run of model damerajee/Oot-v2_lll on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:05:46.112716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of damerajee/Oot-v2_lll\n\n\n\nDataset automatically created during the evaluation run of model damerajee/Oot-v2_lll on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:05:46.112716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
785a5737ccd5175ea6d998d574317a2c079d1e49 |
# Dataset Card for Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details of one task (here: 5-shot Winogrande);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2",
	"harness_winogrande_5",
	split="train")
```
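If you are unsure which configuration names exist, you can enumerate them first. The snippet below is a minimal sketch using the standard `datasets` helper; the exact list returned depends on the current state of the repo.

```python
from datasets import get_dataset_config_names

# Enumerate the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2"
)
print(f"{len(configs)} configurations, e.g.: {configs[:5]}")
```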
## Latest results
These are the [latest results from run 2024-01-13T15:10:27.393635](https://huggingface.co/datasets/open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2/blob/main/results_2024-01-13T15-10-27.393635.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5963604853429504,
"acc_stderr": 0.03312318488664919,
"acc_norm": 0.6028320780728574,
"acc_norm_stderr": 0.03381496123659357,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.40658215292594935,
"mc2_stderr": 0.014101721545122618
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.014595873205358273,
"acc_norm": 0.5716723549488054,
"acc_norm_stderr": 0.014460496367599017
},
"harness|hellaswag|10": {
"acc": 0.6008763194582752,
"acc_stderr": 0.004887174080003034,
"acc_norm": 0.8074088826926907,
"acc_norm_stderr": 0.003935286940315854
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981765,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981765
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159784,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029268,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029268
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164542,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705049,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705049
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.01498727064094601,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.01498727064094601
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584524,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963046,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963046
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.40658215292594935,
"mc2_stderr": 0.014101721545122618
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663606
},
"harness|gsm8k|5": {
"acc": 0.2941622441243366,
"acc_stderr": 0.012551285331470156
}
}
```
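The per-task scores above can be aggregated by hand. The snippet below is a minimal sketch that assumes the JSON shown above has been pasted into a Python dict named `results` (only two of the MMLU subtask entries are reproduced here for brevity); it averages the normalized accuracy over every `hendrycksTest` (MMLU) subtask.

```python
# Minimal sketch: average acc_norm over the MMLU (hendrycksTest) subtasks.
# `results` is assumed to hold the dict printed above; two entries are shown.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.7953216374269005},
}

mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU average (acc_norm): {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```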
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2 | [
"region:us"
] | 2024-01-13T15:12:43+00:00 | {"pretty_name": "Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:10:27.393635](https://huggingface.co/datasets/open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2/blob/main/results_2024-01-13T15-10-27.393635.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5963604853429504,\n \"acc_stderr\": 0.03312318488664919,\n \"acc_norm\": 0.6028320780728574,\n \"acc_norm_stderr\": 0.03381496123659357,\n \"mc1\": 0.2766217870257038,\n \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.40658215292594935,\n \"mc2_stderr\": 0.014101721545122618\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358273,\n \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6008763194582752,\n \"acc_stderr\": 0.004887174080003034,\n \"acc_norm\": 0.8074088826926907,\n \"acc_norm_stderr\": 0.003935286940315854\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981765,\n \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981765\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159784,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.025649381063029268,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.025649381063029268\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713549,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713549\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164542,\n \"acc_norm\": 0.8290155440414507,\n 
\"acc_norm_stderr\": 0.027171213683164542\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705049,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705049\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n \"acc_stderr\": 0.01498727064094601,\n \"acc_norm\": 0.7726692209450831,\n \"acc_norm_stderr\": 0.01498727064094601\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584524,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584524\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963046,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963046\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.40658215292594935,\n \"mc2_stderr\": 0.014101721545122618\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663606\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.2941622441243366,\n \"acc_stderr\": 0.012551285331470156\n }\n}\n```", "repo_url": "https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["**/details_harness|winogrande|5_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-13T15-10-27.393635.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_10_27.393635", "path": ["results_2024-01-13T15-10-27.393635.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-10-27.393635.parquet"]}]}]} | 2024-01-13T15:13:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2
Dataset automatically created during the evaluation run of model timpal0l/Mistral-7B-v0.1-flashback-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T15:10:27.393635 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2\n\n\n\nDataset automatically created during the evaluation run of model timpal0l/Mistral-7B-v0.1-flashback-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:10:27.393635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2\n\n\n\nDataset automatically created during the evaluation run of model timpal0l/Mistral-7B-v0.1-flashback-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:10:27.393635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3d2782bf4dc9ced6ed2946e494c377e8170919f5 |
# Dataset Card for Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/distilabeled-Marcoro14-7B-slerp](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp",
"harness_winogrande_5",
split="train")
```
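Each of the 63 task configurations can be loaded the same way. As a further sketch (the `results` config name and the `latest` split are taken from this repo's own metadata; adjust the names if newer runs are added), you can also enumerate the available configurations and pull the aggregated metrics:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp"

# Enumerate the per-task configurations plus the aggregated "results" config.
print(get_dataset_config_names(repo))

# "latest" always points to the most recent evaluation run for a config.
results = load_dataset(repo, "results", split="latest")
print(results[0].keys())
```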
## Latest results
These are the [latest results from run 2024-01-13T15:19:50.264268](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp/blob/main/results_2024-01-13T15-19-50.264268.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6570721295894518,
"acc_stderr": 0.03197567245123644,
"acc_norm": 0.6568944940985069,
"acc_norm_stderr": 0.032637536590426994,
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655382,
"mc2": 0.6510264969421982,
"mc2_stderr": 0.01511928866079861
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.013659980894277366,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619425
},
"harness|hellaswag|10": {
"acc": 0.6964748058155746,
"acc_stderr": 0.0045884034194496656,
"acc_norm": 0.874726150169289,
"acc_norm_stderr": 0.003303526413123496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.0399926287661772,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.0399926287661772
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179585,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179585
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101008,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655382,
"mc2": 0.6510264969421982,
"mc2_stderr": 0.01511928866079861
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.7119029567854435,
"acc_stderr": 0.012474469737197916
}
}
```
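As a minimal post-processing sketch (the two subtask entries inlined below are copied from the snippet above purely for illustration; in practice you would parse the full `results_2024-01-13T15-19-50.264268.json` file from the repo instead), the per-subject MMLU accuracies can be macro-averaged like this:

```python
import json

# Illustrative subset of the results snippet above; real code would load the
# complete results_*.json file rather than hard-coding two entries.
results = json.loads("""
{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6888888888888889},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527}
}
""")

# Keep only the MMLU (hendrycksTest) subtasks and macro-average their accuracy.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU subtasks, macro-average acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```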
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp | [
"region:us"
] | 2024-01-13T15:22:08+00:00 | {"pretty_name": "Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [argilla/distilabeled-Marcoro14-7B-slerp](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:19:50.264268](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp/blob/main/results_2024-01-13T15-19-50.264268.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6570721295894518,\n \"acc_stderr\": 0.03197567245123644,\n \"acc_norm\": 0.6568944940985069,\n \"acc_norm_stderr\": 0.032637536590426994,\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655382,\n \"mc2\": 0.6510264969421982,\n \"mc2_stderr\": 0.01511928866079861\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.013659980894277366,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619425\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6964748058155746,\n \"acc_stderr\": 0.0045884034194496656,\n \"acc_norm\": 0.874726150169289,\n \"acc_norm_stderr\": 0.003303526413123496\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.0399926287661772,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.0399926287661772\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 
0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.016513676031179585,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.016513676031179585\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101008,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655382,\n \"mc2\": 0.6510264969421982,\n \"mc2_stderr\": 0.01511928866079861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7119029567854435,\n \"acc_stderr\": 0.012474469737197916\n }\n}\n```", "repo_url": "https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-19-50.264268.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-19-50.264268.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-19-50.264268.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-19-50.264268.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-19-50.264268.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["**/details_harness|winogrande|5_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-13T15-19-50.264268.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_19_50.264268", "path": ["results_2024-01-13T15-19-50.264268.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-19-50.264268.parquet"]}]}]} | 2024-01-13T15:22:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp
Dataset automatically created during the evaluation run of model argilla/distilabeled-Marcoro14-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
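For example (a minimal sketch; "harness_winogrande_5" stands in for any of the 63 task configurations, and the repository name follows the leaderboard's `details_<org>__<model>` naming pattern):

```python
from datasets import load_dataset

# Load the per-example details for one task; "latest" points to the newest run.
data = load_dataset("open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp",
	"harness_winogrande_5",
	split="latest")
```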
## Latest results
These are the latest results from run 2024-01-13T15:19:50.264268 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model argilla/distilabeled-Marcoro14-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:19:50.264268(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model argilla/distilabeled-Marcoro14-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:19:50.264268(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
fc55666e57a8123d33257a8b3b7892edd32a7da9 |
# Dataset Card for Evaluation run of abideen/NexoNimbus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__NexoNimbus-7B",
"harness_winogrande_5",
split="train")
```
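The returned object is a regular `datasets` split, so its contents can be inspected directly. A minimal sketch (the exact per-example columns vary by task, so none are assumed here):

```python
from datasets import load_dataset

# "latest" always resolves to the newest evaluation run for this task.
data = load_dataset("open-llm-leaderboard/details_abideen__NexoNimbus-7B",
                    "harness_winogrande_5",
                    split="latest")

print(data.num_rows)      # number of evaluated examples for this task
print(data.column_names)  # per-example record schema
print(data[0])            # first evaluated example with its per-example fields
```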
## Latest results
These are the [latest results from run 2024-01-13T15:21:36.768833](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-7B/blob/main/results_2024-01-13T15-21-36.768833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527701912575271,
"acc_stderr": 0.03198148294278928,
"acc_norm": 0.6519074704749058,
"acc_norm_stderr": 0.03265457793015111,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6242663878330903,
"mc2_stderr": 0.015486654235984039
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403516
},
"harness|hellaswag|10": {
"acc": 0.7086237801234814,
"acc_stderr": 0.004534677750102722,
"acc_norm": 0.8786098386775543,
"acc_norm_stderr": 0.0032591270576681724
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.036390575699529276,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.036390575699529276
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6242663878330903,
"mc2_stderr": 0.015486654235984039
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589538
}
}
```
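These aggregates can also be consumed programmatically, either through the "results" configuration of this dataset or by parsing the linked JSON file. A minimal sketch of the latter, assuming `results_json` holds the JSON document shown above:

```python
import json

# `results_json` is assumed to hold the JSON document shown above.
results = json.loads(results_json)

# Average the normalized accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean_acc_norm:.4f}")
```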
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abideen__NexoNimbus-7B | [
"region:us"
] | 2024-01-13T15:23:56+00:00 | {"pretty_name": "Evaluation run of abideen/NexoNimbus-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__NexoNimbus-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:21:36.768833](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-7B/blob/main/results_2024-01-13T15-21-36.768833.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527701912575271,\n \"acc_stderr\": 0.03198148294278928,\n \"acc_norm\": 0.6519074704749058,\n \"acc_norm_stderr\": 0.03265457793015111,\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6242663878330903,\n \"mc2_stderr\": 0.015486654235984039\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403516\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7086237801234814,\n \"acc_stderr\": 0.004534677750102722,\n \"acc_norm\": 0.8786098386775543,\n \"acc_norm_stderr\": 0.0032591270576681724\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.036390575699529276,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.036390575699529276\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n 
\"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6242663878330903,\n \"mc2_stderr\": 0.015486654235984039\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \"acc_stderr\": 0.012579398235589538\n }\n}\n```", "repo_url": "https://huggingface.co/abideen/NexoNimbus-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["**/details_harness|winogrande|5_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-21-36.768833.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_21_36.768833", "path": ["results_2024-01-13T15-21-36.768833.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T15-21-36.768833.parquet"]}]}]} | 2024-01-13T15:24:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abideen/NexoNimbus-7B
Dataset automatically created during the evaluation run of model abideen/NexoNimbus-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
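A minimal loading sketch (the repository id here is inferred from the `details_<org>__<model>` naming pattern used for the other evaluation-run datasets in this dump, so treat it as an assumption):

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration.
# Repo id follows the open-llm-leaderboard naming convention (assumed).
data = load_dataset("open-llm-leaderboard/details_abideen__NexoNimbus-7B",
	"harness_winogrande_5",
	split="train")
```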
## Latest results
These are the latest results from run 2024-01-13T15:21:36.768833 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abideen/NexoNimbus-7B\n\n\n\nDataset automatically created during the evaluation run of model abideen/NexoNimbus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:21:36.768833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abideen/NexoNimbus-7B\n\n\n\nDataset automatically created during the evaluation run of model abideen/NexoNimbus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:21:36.768833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e8a750c99d27e12e15f2237937f4338742c52edd |
Combination of EQUATE, FOLIO, and LogicInference_OA | euclaise/logician | [
"region:us"
] | 2024-01-13T15:27:22+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4485373.0, "num_examples": 7941}], "download_size": 1425488, "dataset_size": 4485373.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T12:15:53+00:00 | [] | [] | TAGS
#region-us
|
Combination of EQUATE, FOLIO, and LogicInference_OA | [] | [
"TAGS\n#region-us \n"
] |
7d5c9db58d746ee0de72d04c150a18d5a55c47fb |
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
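
# "harness_winogrande_5" is one of the 63 task configurations;
# the "train" split points at the latest run's per-sample details.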
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T15:36:50.763352](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2/blob/main/results_2024-01-13T15-36-50.763352.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6114503041359706,
"acc_stderr": 0.03288132466269303,
"acc_norm": 0.6172605395331842,
"acc_norm_stderr": 0.033549678952002004,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5782258262756715,
"mc2_stderr": 0.015856347434414303
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.014117971901142822,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205763
},
"harness|hellaswag|10": {
"acc": 0.6749651463851822,
"acc_stderr": 0.004674306182532131,
"acc_norm": 0.8583947420832504,
"acc_norm_stderr": 0.00347932286022565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657564,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435111,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435111
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5782258262756715,
"mc2_stderr": 0.015856347434414303
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827936
},
"harness|gsm8k|5": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085026
}
}
```
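
If you only need the aggregate numbers above rather than per-sample details, the "results" configuration collects them; a minimal sketch, assuming the split names listed in this card's config metadata:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# its "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics shown above
```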
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2 | [
"region:us"
] | 2024-01-13T15:39:07+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:36:50.763352](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter2/blob/main/results_2024-01-13T15-36-50.763352.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6114503041359706,\n \"acc_stderr\": 0.03288132466269303,\n \"acc_norm\": 0.6172605395331842,\n \"acc_norm_stderr\": 0.033549678952002004,\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5782258262756715,\n \"mc2_stderr\": 0.015856347434414303\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.014117971901142822,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205763\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6749651463851822,\n \"acc_stderr\": 0.004674306182532131,\n \"acc_norm\": 0.8583947420832504,\n \"acc_norm_stderr\": 0.00347932286022565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.024639789097709443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8058748403575989,\n \"acc_stderr\": 0.014143970276657564,\n \"acc_norm\": 0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657564\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n \"acc_stderr\": 0.012698825252435111,\n \"acc_norm\": 0.4471968709256845,\n \"acc_norm_stderr\": 0.012698825252435111\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5782258262756715,\n \"mc2_stderr\": 0.015856347434414303\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3305534495830174,\n \"acc_stderr\": 0.012957496367085026\n 
}\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-36-50.763352.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-36-50.763352.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-36-50.763352.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-36-50.763352.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-36-50.763352.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_36_50.763352", "path": ["**/details_harness|winogrande|5_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-36-50.763352.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T15_36_50.763352", "path": ["results_2024-01-13T15-36-50.763352.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-36-50.763352.parquet"]}]}]} | 2024-01-13T15:39:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2
Dataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T15:36:50.763352 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:36:50.763352(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:36:50.763352(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1234ced6d5b631a2fedb9dfd35366cd214e594c2 | # Norwegian Translated SST-2 Dataset
## Dataset
### Overview
The dataset is a Norwegian machine translation of the [Stanford Sentiment Treebank](https://huggingface.co/datasets/sst2) (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.
### Dataset Structure
The dataset has the following structure:
```json
{
"idx": int,
"sentence": str,
"label": int,
"sentence_nob": str
}
```
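
To make the structure concrete, here is a minimal sketch of loading the data with the Hugging Face `datasets` library; the repository id and the single `train` split are taken from this card's metadata, while the printed values are placeholders:

```python
from datasets import load_dataset

# Load the single train split of the Norwegian SST-2 translation
ds = load_dataset("Kushtrim/sst2-norwegian-bokmaal", split="train")

# Each row pairs the English sentence with its Norwegian counterpart
row = ds[0]
print(row["sentence"], "->", row["sentence_nob"], "| label:", row["label"])
```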
### Data Fields
- `idx`: Monotonically increasing index ID.
- `sentence`: Complete sentence expressing an opinion about a film.
- `label`: Sentiment of the opinion, either "negative" (0) or "positive" (1). The test set labels are hidden (-1).
- `sentence_nob`: Norwegian translation of the sentence. | Kushtrim/sst2-norwegian-bokmaal | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:no",
"language:en",
"region:us"
] | 2024-01-13T15:45:40+00:00 | {"language": ["no", "en"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "sentence", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "sentence_nob", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8811984, "num_examples": 67349}], "download_size": 5800043, "dataset_size": 8811984}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T22:09:38+00:00 | [] | [
"no",
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-Norwegian #language-English #region-us
| # Norwegian Translated SST-2 Dataset
## Dataset
### Overview
The dataset is a Norwegian machine translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.
### Dataset Structure
The dataset has the following structure:
### Data Fields
- 'idx': Monotonically increasing index ID.
- 'sentence': Complete sentence expressing an opinion about a film.
- 'label': Sentiment of the opinion, either "negative" (0) or "positive" (1). The test set labels are hidden (-1).
- 'sentence_nob': Norwegian translation of the sentence. | [
"# Norwegian Translated SST-2 Dataset",
"## Dataset",
"### Overview\n\nThe dataset is a Norwegian machine-translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.",
"### Dataset Structure\n\nThe dataset has the following structure:",
"### Data Fields\n\n- 'idx': Monotonically increasing index ID.\n- 'sentence': Complete sentence expressing an opinion about a film.\n- 'label': Sentiment of the opinion, either \"negative\" (0) or positive (1). The test set labels are hidden (-1).\n- 'sentence_nob': Norwegian translation of the sentence."
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Norwegian #language-English #region-us \n",
"# Norwegian Translated SST-2 Dataset",
"## Dataset",
"### Overview\n\nThe dataset is a Norwegian machine-translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.",
"### Dataset Structure\n\nThe dataset has the following structure:",
"### Data Fields\n\n- 'idx': Monotonically increasing index ID.\n- 'sentence': Complete sentence expressing an opinion about a film.\n- 'label': Sentiment of the opinion, either \"negative\" (0) or positive (1). The test set labels are hidden (-1).\n- 'sentence_nob': Norwegian translation of the sentence."
] |
2321e77ea6dd4c6e1ab78f69323890e15ed5c68f | # Albanian Translated SST-2 Dataset
## Dataset
### Overview
The dataset is an Albanian machine translation of the [Stanford Sentiment Treebank](https://huggingface.co/datasets/sst2) (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.
### Dataset Structure
The dataset has the following structure:
```json
{
"idx": int,
"sentence": str,
"label": int,
"sentence_alb": str
}
```
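
A minimal loading sketch with the Hugging Face `datasets` library follows; the repository id and the `train` split come from this card's metadata:

```python
from datasets import load_dataset

# Load the train split of the Albanian SST-2 translation
ds = load_dataset("Kushtrim/sst2-albanian", split="train")

# Inspect one English/Albanian sentence pair and its label
row = ds[0]
print(row["sentence"], "->", row["sentence_alb"], "| label:", row["label"])
```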
### Data Fields
- `idx`: Monotonically increasing index ID.
- `sentence`: Complete sentence expressing an opinion about a film.
- `label`: Sentiment of the opinion, either "negative" (0) or "positive" (1). The test set labels are hidden (-1).
- `sentence_alb`: Albanian translation of the sentence. | Kushtrim/sst2-albanian | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:sq",
"language:en",
"region:us"
] | 2024-01-13T15:45:44+00:00 | {"language": ["sq", "en"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "sentence", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "sentence_alb", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9445185, "num_examples": 67349}], "download_size": 5969256, "dataset_size": 9445185}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T19:52:24+00:00 | [] | [
"sq",
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-Albanian #language-English #region-us
| # Albanian Translated SST-2 Dataset
## Dataset
### Overview
The dataset is an Albanian machine translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.
### Dataset Structure
The dataset has the following structure:
### Data Fields
- 'idx': Monotonically increasing index ID.
- 'sentence': Complete sentence expressing an opinion about a film.
- 'label': Sentiment of the opinion, either "negative" (0) or "positive" (1). The test set labels are hidden (-1).
- 'sentence_alb': Albanian translation of the sentence. | [
"# Albanian Translated SST-2 Dataset",
"## Dataset",
"### Overview\n\nThe dataset is a Albanian machine-translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.",
"### Dataset Structure\n\nThe dataset has the following structure:",
"### Data Fields\n\n- 'idx': Monotonically increasing index ID.\n- 'sentence': Complete sentence expressing an opinion about a film.\n- 'label': Sentiment of the opinion, either \"negative\" (0) or positive (1). The test set labels are hidden (-1).\n- 'sentence_alb': Albanian translation of the sentence."
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Albanian #language-English #region-us \n",
"# Albanian Translated SST-2 Dataset",
"## Dataset",
"### Overview\n\nThe dataset is a Albanian machine-translation of the Stanford Sentiment Treebank (SST-2). The original dataset comprises sentences extracted from movie reviews, accompanied by human annotations indicating their sentiment.",
"### Dataset Structure\n\nThe dataset has the following structure:",
"### Data Fields\n\n- 'idx': Monotonically increasing index ID.\n- 'sentence': Complete sentence expressing an opinion about a film.\n- 'label': Sentiment of the opinion, either \"negative\" (0) or positive (1). The test set labels are hidden (-1).\n- 'sentence_alb': Albanian translation of the sentence."
] |
1df5cddd55717e9e893805bd535e6e35ced1ca0e |
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Fetch the per-example details of one task (here: 5-shot Winogrande) for this run
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0",
"harness_winogrande_5",
split="train")
```
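
By analogy, the aggregated metrics can be pulled from the "results" configuration named above. This is a hedged sketch: the `"latest"` split name is an assumption based on the split listings these leaderboard detail datasets usually ship with:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split is assumed to point at the newest evaluation.
results = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0",
                       "results",
                       split="latest")
print(results[0])
```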
## Latest results
These are the [latest results from run 2024-01-13T15:47:16.037619](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0/blob/main/results_2024-01-13T15-47-16.037619.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6119702436885257,
"acc_stderr": 0.03283940141095218,
"acc_norm": 0.6171401402100634,
"acc_norm_stderr": 0.03350745031301805,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5034226817976417,
"mc2_stderr": 0.015916803874695535
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735567,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882417
},
"harness|hellaswag|10": {
"acc": 0.6543517227643896,
"acc_stderr": 0.00474607219107258,
"acc_norm": 0.8442541326428998,
"acc_norm_stderr": 0.0036187316588377175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.01720857935778758,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.01720857935778758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.016329061073207442,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.016329061073207442
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.02570264026060374,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.02570264026060374
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734806,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.019431775677037313,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.019431775677037313
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5034226817976417,
"mc2_stderr": 0.015916803874695535
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089686
},
"harness|gsm8k|5": {
"acc": 0.36315390447308565,
"acc_stderr": 0.013246614539839866
}
}
```
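For quick programmatic access, the per-sample details behind the aggregated scores above can be reloaded from this repository. The snippet below is a minimal sketch: the configuration name (`harness_gsm8k_5`) and the `latest` split are assumptions based on this repository's naming convention for evaluation runs; swap in whichever task configuration you need.

```python
from datasets import load_dataset

# Minimal sketch: reload the per-example details behind one of the
# aggregated metrics above. "harness_gsm8k_5" is assumed to be one of
# this repository's task configurations, and the "latest" split is
# assumed to point at the most recent evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0",
    "harness_gsm8k_5",
    split="latest",
)

print(details)               # per-example predictions and metrics
print(details.column_names)  # inspect the available fields
```

From the per-example rows one can, for instance, recompute the 5-shot GSM8K accuracy reported in the results block above.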
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-13T15:48:47+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:47:16.037619](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0/blob/main/results_2024-01-13T15-47-16.037619.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6119702436885257,\n \"acc_stderr\": 0.03283940141095218,\n \"acc_norm\": 0.6171401402100634,\n \"acc_norm_stderr\": 0.03350745031301805,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5034226817976417,\n \"mc2_stderr\": 0.015916803874695535\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735567,\n \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882417\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6543517227643896,\n \"acc_stderr\": 0.00474607219107258,\n \"acc_norm\": 0.8442541326428998,\n \"acc_norm_stderr\": 0.0036187316588377175\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.024639789097709443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.01720857935778758,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.01720857935778758\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n \"acc_stderr\": 0.016329061073207442,\n \"acc_norm\": 0.39217877094972065,\n \"acc_norm_stderr\": 0.016329061073207442\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n \"acc_stderr\": 0.012716941720734806,\n \"acc_norm\": 0.45436766623207303,\n \"acc_norm_stderr\": 0.012716941720734806\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.019431775677037313,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.019431775677037313\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982055,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982055\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5034226817976417,\n \"mc2_stderr\": 0.015916803874695535\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089686\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.36315390447308565,\n \"acc_stderr\": 0.013246614539839866\n }\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-46-28.796344.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-46-28.796344.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-47-16.037619.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-47-16.037619.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-47-16.037619.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-47-16.037619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-47-16.037619.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": 
"2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-46-28.796344.parquet"]}, 
{"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["**/details_harness|winogrande|5_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": ["**/details_harness|winogrande|5_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-47-16.037619.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_46_28.796344", "path": ["results_2024-01-13T15-46-28.796344.parquet"]}, {"split": "2024_01_13T15_47_16.037619", "path": 
["results_2024-01-13T15-47-16.037619.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-47-16.037619.parquet"]}]}]} | 2024-01-13T15:49:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0
Dataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
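A minimal sketch of that load call, assuming the repo id follows the leaderboard's standard `details_{org}__{model}` naming (the exact id is not spelled out in this processed card):

```python
from datasets import load_dataset

# Assumed repo id, derived from the leaderboard's usual naming scheme.
data = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter0",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # "train" always points to the latest results
)
```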
## Latest results
These are the latest results from run 2024-01-13T15:47:16.037619 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:47:16.037619(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:47:16.037619(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
49ecdd4f862c433415a7899934a3c62bcca71ef9 |
# Dataset Card for Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo](https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo",
"harness_winogrande_5",
split="train")
```
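The aggregated metrics can be pulled the same way via the "results" configuration; a small sketch, assuming the "latest" split naming used for the other configurations:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo",
    "results",
    split="latest",
)
```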
## Latest results
These are the [latest results from run 2024-01-13T15:50:10.127087](https://huggingface.co/datasets/open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo/blob/main/results_2024-01-13T15-50-10.127087.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2576008328471532,
"acc_stderr": 0.03077282985808684,
"acc_norm": 0.25844020358409664,
"acc_norm_stderr": 0.03151828137960803,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.37376752806468966,
"mc2_stderr": 0.013846261711668974
},
"harness|arc:challenge|25": {
"acc": 0.34897610921501704,
"acc_stderr": 0.013928933461382494,
"acc_norm": 0.3583617747440273,
"acc_norm_stderr": 0.014012883334859866
},
"harness|hellaswag|10": {
"acc": 0.45817566221868156,
"acc_stderr": 0.004972293764978727,
"acc_norm": 0.6129257120095598,
"acc_norm_stderr": 0.0048608542408219695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220575,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220575
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204416,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204416
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380548,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343578,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127522,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225624,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225624
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872395,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872395
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659693,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659693
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229427,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229427
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667192,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667192
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1346938775510204,
"acc_stderr": 0.021855658840811615,
"acc_norm": 0.1346938775510204,
"acc_norm_stderr": 0.021855658840811615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.036471685236832266,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.036471685236832266
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.37376752806468966,
"mc2_stderr": 0.013846261711668974
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008463
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357117
}
}
```
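For quick sanity checks you can aggregate the per-task numbers yourself. The sketch below averages the `acc` values over the MMLU (`hendrycksTest-*`) entries of a results dictionary shaped like the one above; only a small excerpt of the dict is inlined here for illustration:

```python
# Excerpt of the results dict shown above (the full dict has 57 MMLU subtasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.17777777777777778},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
    "harness|winogrande|5": {"acc": 0.6101026045777427},  # not an MMLU task; ignored below
}

# Mean accuracy across the MMLU subtasks only.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average over {len(mmlu_accs)} subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```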
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo | [
"region:us"
] | 2024-01-13T15:52:01+00:00 | {"pretty_name": "Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo](https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:50:10.127087](https://huggingface.co/datasets/open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo/blob/main/results_2024-01-13T15-50-10.127087.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2576008328471532,\n \"acc_stderr\": 0.03077282985808684,\n \"acc_norm\": 0.25844020358409664,\n \"acc_norm_stderr\": 0.03151828137960803,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.37376752806468966,\n \"mc2_stderr\": 0.013846261711668974\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34897610921501704,\n \"acc_stderr\": 0.013928933461382494,\n \"acc_norm\": 0.3583617747440273,\n \"acc_norm_stderr\": 0.014012883334859866\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45817566221868156,\n \"acc_stderr\": 0.004972293764978727,\n \"acc_norm\": 0.6129257120095598,\n \"acc_norm_stderr\": 0.0048608542408219695\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 
0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n 
\"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204416,\n \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204416\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380548,\n \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343578,\n \"acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343578\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n 
\"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n \"acc_stderr\": 0.016160871405127522,\n \"acc_norm\": 0.28607918263090676,\n \"acc_norm_stderr\": 0.016160871405127522\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225624,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225624\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872395,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872395\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n \"acc_stderr\": 0.010813585552659693,\n \"acc_norm\": 0.23402868318122555,\n \"acc_norm_stderr\": 0.010813585552659693\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229427,\n \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229427\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667192,\n \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667192\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1346938775510204,\n \"acc_stderr\": 0.021855658840811615,\n \"acc_norm\": 0.1346938775510204,\n \"acc_norm_stderr\": 0.021855658840811615\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.036471685236832266,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.036471685236832266\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.37376752806468966,\n \"mc2_stderr\": 0.013846261711668974\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.0038289829787357117\n }\n}\n```", "repo_url": "https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["**/details_harness|winogrande|5_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-50-10.127087.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T15_50_10.127087", "path": ["results_2024-01-13T15-50-10.127087.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-50-10.127087.parquet"]}]}]} | 2024-01-13T15:52:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo
Dataset automatically created during the evaluation run of model davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
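For example, a minimal sketch (the repository id below is assumed, inferred from the model name following the leaderboard's `details_<org>__<model>` naming pattern, not quoted from this record):

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the model name above following the
# leaderboard's "details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo",
    "harness_winogrande_5",
    split="train",
)
```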
## Latest results
These are the latest results from run 2024-01-13T15:50:10.127087 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo\n\n\n\nDataset automatically created during the evaluation run of model davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:50:10.127087(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo\n\n\n\nDataset automatically created during the evaluation run of model davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:50:10.127087(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4d54a3255ca8448bf73cec42279ec8bc94b79047 |
# Dataset Card for Evaluation run of Kquant03/Medusa-7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Medusa-7B-bf16](https://huggingface.co/Kquant03/Medusa-7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16",
"harness_winogrande_5",
split="train")
```
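
Similarly, a minimal sketch for pulling the aggregated metrics; per this repo's configuration list, a "results" config exists and its "latest" split points to the most recent run:

```python
from datasets import load_dataset

# The "results" config holds one row of aggregated metrics per run; the
# "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16",
    "results",
    split="latest",
)
print(results[0])
```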
## Latest results
These are the [latest results from run 2024-01-13T15:51:28.685005](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16/blob/main/results_2024-01-13T15-51-28.685005.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5710190127480856,
"acc_stderr": 0.03332687168090072,
"acc_norm": 0.5813613001110371,
"acc_norm_stderr": 0.03416710231417157,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431438,
"mc2": 0.5573938277943279,
"mc2_stderr": 0.015496591664705626
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642478,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6106353316072496,
"acc_stderr": 0.004866096880941442,
"acc_norm": 0.7998406691894046,
"acc_norm_stderr": 0.003993017173367213
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547822,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547822
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650744,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650744
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998879,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998879
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424523,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424523
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625686,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.01268397251359881,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.01268397251359881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431438,
"mc2": 0.5573938277943279,
"mc2_stderr": 0.015496591664705626
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998282
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073872
}
}
```
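
As a quick sanity check on the numbers above, here is a minimal sketch (assuming the JSON block has been saved to a hypothetical `results.json` file) that recomputes the macro-average normalized accuracy over the hendrycksTest (MMLU) subtasks:

```python
import json

# "results.json" is a hypothetical filename for the JSON block shown above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm across the hendrycksTest (MMLU) subtask entries; the "all"
# key and the non-MMLU harnesses are excluded by the key-prefix filter.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"Macro-average MMLU acc_norm over {len(mmlu)} subtasks: {avg:.4f}")
```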
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16 | [
"region:us"
] | 2024-01-13T15:53:48+00:00 | {"pretty_name": "Evaluation run of Kquant03/Medusa-7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Medusa-7B-bf16](https://huggingface.co/Kquant03/Medusa-7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:51:28.685005](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16/blob/main/results_2024-01-13T15-51-28.685005.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5710190127480856,\n \"acc_stderr\": 0.03332687168090072,\n \"acc_norm\": 0.5813613001110371,\n \"acc_norm_stderr\": 0.03416710231417157,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431438,\n \"mc2\": 0.5573938277943279,\n \"mc2_stderr\": 0.015496591664705626\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642478,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6106353316072496,\n \"acc_stderr\": 0.004866096880941442,\n \"acc_norm\": 0.7998406691894046,\n \"acc_norm_stderr\": 0.003993017173367213\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547822,\n \"acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547822\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650744,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650744\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335833,\n 
\"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306376,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998879,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998879\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625686,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625686\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.576797385620915,\n \"acc_stderr\": 0.01998780976948206,\n \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.01998780976948206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431438,\n \"mc2\": 0.5573938277943279,\n \"mc2_stderr\": 0.015496591664705626\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073872\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Medusa-7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-51-28.685005.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-51-28.685005.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-51-28.685005.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-51-28.685005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-51-28.685005.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_51_28.685005", "path": ["**/details_harness|winogrande|5_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-51-28.685005.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T15_51_28.685005", "path": ["results_2024-01-13T15-51-28.685005.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-51-28.685005.parquet"]}]}]} | 2024-01-13T15:54:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Medusa-7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Medusa-7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
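For instance, a minimal sketch of the usual loader snippet (the dataset id below follows the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed in this record's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluation task; the "train" split
# points at the latest run of this configuration.
data = load_dataset("open-llm-leaderboard/details_Kquant03__Medusa-7B-bf16",
                    "harness_winogrande_5",
                    split="train")
```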
## Latest results
These are the latest results from run 2024-01-13T15:51:28.685005 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
eb733ea70b4262228f1f2910e088b70d182aa97e |
# Dataset Card for Evaluation run of UCLA-AGI/test_final
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/test_final](https://huggingface.co/UCLA-AGI/test_final) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test_final",
"harness_winogrande_5",
split="train")
```
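To work with the aggregated numbers instead, you can load the "results" configuration mentioned above. A short sketch (the configuration list for this dataset defines a "latest" split alias alongside the timestamped one):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split resolves to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test_final",
                       "results",
                       split="latest")
```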
## Latest results
These are the [latest results from run 2024-01-13T15:52:58.260309](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test_final/blob/main/results_2024-01-13T15-52-58.260309.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144035496773548,
"acc_stderr": 0.032858739117399755,
"acc_norm": 0.6200519616024565,
"acc_norm_stderr": 0.03352475225298005,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.676458872734515,
"acc_stderr": 0.0046687106891924,
"acc_norm": 0.8584943238398726,
"acc_norm_stderr": 0.0034783009945146973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709447,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.0196438015579248,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.0196438015579248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.3419257012888552,
"acc_stderr": 0.0130660896251828
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_UCLA-AGI__test_final | [
"region:us"
] | 2024-01-13T15:55:16+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/test_final", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/test_final](https://huggingface.co/UCLA-AGI/test_final) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test_final\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T15:52:58.260309](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test_final/blob/main/results_2024-01-13T15-52-58.260309.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144035496773548,\n \"acc_stderr\": 0.032858739117399755,\n \"acc_norm\": 0.6200519616024565,\n \"acc_norm_stderr\": 0.03352475225298005,\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676458872734515,\n \"acc_stderr\": 0.0046687106891924,\n \"acc_norm\": 0.8584943238398726,\n \"acc_norm_stderr\": 0.0034783009945146973\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 
0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.619281045751634,\n \"acc_stderr\": 0.0196438015579248,\n \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.0196438015579248\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3419257012888552,\n \"acc_stderr\": 0.0130660896251828\n }\n}\n```", "repo_url": 
"https://huggingface.co/UCLA-AGI/test_final", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T15_52_58.260309", "path": ["**/details_harness|winogrande|5_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T15-52-58.260309.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T15_52_58.260309", "path": ["results_2024-01-13T15-52-58.260309.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T15-52-58.260309.parquet"]}]}]} | 2024-01-13T15:55:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/test_final
Dataset automatically created during the evaluation run of model UCLA-AGI/test_final on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
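```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test_final",
    "harness_winogrande_5",
    split="train")
```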
## Latest results
These are the latest results from run 2024-01-13T15:52:58.260309 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
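The aggregated metrics shown above can also be reloaded directly from the `results` configuration, whose `latest` split always points at the most recent run (the config and split names are taken from this repository's configuration list):

```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test_final",
    "results",
    split="latest")
```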
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/test_final\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test_final on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:52:58.260309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/test_final\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test_final on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T15:52:58.260309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bca492d5b414ce1e67a499b30470e2376b8a7d7f | # Dataset Card for hh-rlhf-strength-cleaned
**Other Language Versions: [English](README.md), [中文](README_zh.md).**
## Dataset Description
In the paper titled "[Secrets of RLHF in Large Language Models Part II: Reward Modeling](https://arxiv.org/abs/2401.06080)" we measured the preference strength of each preference pair in the [hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf "https://huggingface.co/datasets/Anthropic/hh-rlhf") dataset through model ensemble and annotated the valid set with GPT-4. In this repository, we provide:
1. Metadata of preference strength for both the training and valid sets.
2. GPT-4 annotations on the valid set.
We mixed the hh-rlhf dataset and redivided it into a training set (151k) and a valid set (17k) with a ratio of 9:1.
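As a quick illustration of how these fields can be used, the sketch below loads the dataset and keeps only the pairs on which the reward-model ensemble strongly and consistently prefers the chosen response. The split name and the numeric thresholds are assumptions for illustration, not values from the paper; the field names follow the Field Description table below.

```python
from datasets import load_dataset

# The split name "train" is an assumption -- check the repository for the
# actual configuration. Field names follow the Field Description table.
ds = load_dataset("fnlp/hh-rlhf-strength-cleaned", split="train")

# Keep pairs whose mean preference difference clearly favors the chosen
# response and whose disagreement across the N ensemble models (std) is
# small; the thresholds below are illustrative only.
confident = ds.filter(
    lambda ex: ex["mean preference difference"] > 1.0
    and ex["std preference difference"] < 0.5
)
print(f"{len(confident)} high-confidence preference pairs out of {len(ds)}")
```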
## Field Description
| Field Name | Field Description | Remarks |
| --------------------------- | ------------------------------------------------------------------------------ | ------------------------------------- |
| chosen | Same as the hh-rlhf dataset. The last line represents the chosen response, and the preceding lines constitute the dialogue history | Type is a list. The dialogue history for both chosen and rejected responses is the same |
| rejected | Same as the hh-rlhf dataset. The last line represents the rejected response, and the preceding lines constitute the dialogue history | Type is a list. The dialogue history for both chosen and rejected responses is the same |
| GPT4 label | GPT-4 annotation for preference pairs; 1 indicates GPT-4 prefers chosen, 0 indicates GPT-4 prefers rejected, and -1 indicates that the label does not exist | Only present in the valid set |
| mean preference difference | Metric measuring preference strength as discussed in the paper; absolute value indicates the magnitude, and positive/negative indicates preference for chosen or rejected, respectively | Average of preference strengths across N models |
| std preference difference | Metric measuring uncertainty in preference strength, representing the standard deviation among preference strengths from different models | Standard deviation of preference strengths across N models |
| chosen score list | List of scores given by N models for the chosen option in each preference pair | Type is a list, each element represents the score given by a single model |
| rejected score list | List of scores given by N models for the rejected option in each preference pair | Type is a list, each element represents the score given by a single model | | fnlp/hh-rlhf-strength-cleaned | [
"license:apache-2.0",
"arxiv:2401.06080",
"region:us"
] | 2024-01-13T15:59:59+00:00 | {"license": "apache-2.0"} | 2024-01-31T13:56:07+00:00 | [
"2401.06080"
] | [] | TAGS
#license-apache-2.0 #arxiv-2401.06080 #region-us
| Dataset Card for hh-rlhf-strength-cleaned
=========================================
Other Language Versions: English, 中文.
Dataset Description
-------------------
In the paper titled "Secrets of RLHF in Large Language Models Part II: Reward Modeling" we measured the preference strength of each preference pair in the hh-rlhf dataset through model ensemble and annotated the valid set with GPT-4. In this repository, we provide:
1. Metadata of preference strength for both the training and valid sets.
2. GPT-4 annotations on the valid set.
We mixed the hh-rlhf dataset and redivided it into a training set (151k) and a valid set (17k) with a ratio of 9:1.
Field Description
-----------------
Field Name: chosen, Field Description: Same as the hh-rlhf dataset. The last line represents the chosen response, and the preceding lines constitute the dialogue history, Remarks: Type is a list. The dialogue history for both chosen and rejected responses is the same
Field Name: rejected, Field Description: Same as the hh-rlhf dataset. The last line represents the rejected response, and the preceding lines constitute the dialogue history, Remarks: Type is a list. The dialogue history for both chosen and rejected responses is the same
Field Name: GPT4 label, Field Description: GPT-4 annotation for preference pairs; 1 indicates GPT-4 prefers chosen, 0 indicates GPT-4 prefers rejected, and -1 indicates that the label does not exist, Remarks: Only present in the valid set
Field Name: mean preference difference, Field Description: Metric measuring preference strength as discussed in the paper; absolute value indicates the magnitude, and positive/negative indicates preference for chosen or rejected, respectively, Remarks: Average of preference strengths across N models
Field Name: std preference difference, Field Description: Metric measuring uncertainty in preference strength, representing the standard deviation among preference strengths from different models, Remarks: Standard deviation of preference strengths across N models
Field Name: chosen score list, Field Description: List of scores given by N models for the chosen option in each preference pair, Remarks: Type is a list, each element represents the score given by a single model
Field Name: rejected score list, Field Description: List of scores given by N models for the rejected option in each preference pair, Remarks: Type is a list, each element represents the score given by a single model
| [] | [
"TAGS\n#license-apache-2.0 #arxiv-2401.06080 #region-us \n"
] |
18d9f6cfdfd509fb83bedbd3ff13ebb596ef1d34 |
[LaplaceAI Traditional Chinese Domain-Knowledge Dataset Project](https://huggingface.co/laplace-ai-lab)

Drawing on my expertise in **web-crawler automation and data post-processing**, I build and maintain domain-knowledge datasets of various sizes.

On the LaplaceAI Hugging Face page, you can find datasets covering many different domains.

This dataset contains **dentistry-related knowledge** curated and maintained by LaplaceAI.
| DataAgent/Pretrain-Taiwan-DentistKnowledge-zhTW-290K | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:zh",
"license:gpl-3.0",
"medical",
"region:us"
] | 2024-01-13T16:16:19+00:00 | {"language": ["zh"], "license": "gpl-3.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "f", "tags": ["medical"]} | 2024-01-13T16:26:21+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #license-gpl-3.0 #medical #region-us
|
LaplaceAI Traditional Chinese Domain-Knowledge Dataset Project

Drawing on my expertise in web-crawler automation and data post-processing, I build and maintain domain-knowledge datasets of various sizes.

On the LaplaceAI Hugging Face page, you can find datasets covering many different domains.

This dataset contains dentistry-related knowledge curated and maintained by LaplaceAI.
| [] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #license-gpl-3.0 #medical #region-us \n"
] |
e66a18ade152b48eeefc3f173b646c2e7cb251d5 |
# Dataset Card for Evaluation run of Praneeth/StarMix-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Praneeth/StarMix-7B-slerp](https://huggingface.co/Praneeth/StarMix-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Praneeth__StarMix-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T16:15:20.775949](https://huggingface.co/datasets/open-llm-leaderboard/details_Praneeth__StarMix-7B-slerp/blob/main/results_2024-01-13T16-15-20.775949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.627260065696873,
"acc_stderr": 0.03261645400649572,
"acc_norm": 0.630018525574998,
"acc_norm_stderr": 0.033271872204890376,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5781119137521107,
"mc2_stderr": 0.015588663872119645
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.014296513020180642,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063228
},
"harness|hellaswag|10": {
"acc": 0.6568412666799442,
"acc_stderr": 0.004737936758047631,
"acc_norm": 0.8510256920932086,
"acc_norm_stderr": 0.003553354528132363
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924006,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548302,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.01653061740926685,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.01653061740926685
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.016619881988177022,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.016619881988177022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277743,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277743
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5781119137521107,
"mc2_stderr": 0.015588663872119645
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205083
},
"harness|gsm8k|5": {
"acc": 0.5367702805155421,
"acc_stderr": 0.01373519195646865
}
}
```
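Because the block above is plain JSON, the per-task entries can be post-processed directly. The sketch below recomputes the average accuracy over the hendrycksTest (MMLU) subtasks, assuming the results file has been downloaded locally and its top-level structure matches the dict printed above:

```python
import json

with open("results_2024-01-13T16-15-20.775949.json") as f:  # local copy of the linked file
    results = json.load(f)

# Mean accuracy over the hendrycksTest (MMLU) subtasks only.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu_accs) / len(mmlu_accs))
```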
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Praneeth__StarMix-7B-slerp | [
"region:us"
] | 2024-01-13T16:17:37+00:00 | {"pretty_name": "Evaluation run of Praneeth/StarMix-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Praneeth/StarMix-7B-slerp](https://huggingface.co/Praneeth/StarMix-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Praneeth__StarMix-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T16:15:20.775949](https://huggingface.co/datasets/open-llm-leaderboard/details_Praneeth__StarMix-7B-slerp/blob/main/results_2024-01-13T16-15-20.775949.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.627260065696873,\n \"acc_stderr\": 0.03261645400649572,\n \"acc_norm\": 0.630018525574998,\n \"acc_norm_stderr\": 0.033271872204890376,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5781119137521107,\n \"mc2_stderr\": 0.015588663872119645\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.014296513020180642,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063228\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6568412666799442,\n \"acc_stderr\": 0.004737936758047631,\n \"acc_norm\": 0.8510256920932086,\n \"acc_norm_stderr\": 0.003553354528132363\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548302,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.01653061740926685,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.01653061740926685\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.01385372417092253,\n 
\"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.016619881988177022,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.016619881988177022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277743,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277743\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5781119137521107,\n \"mc2_stderr\": 0.015588663872119645\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205083\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5367702805155421,\n \"acc_stderr\": 0.01373519195646865\n }\n}\n```", "repo_url": 
"https://huggingface.co/Praneeth/StarMix-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-15-20.775949.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-15-20.775949.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-15-20.775949.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-15-20.775949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-15-20.775949.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T16_15_20.775949", "path": ["**/details_harness|winogrande|5_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T16-15-20.775949.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T16_15_20.775949", "path": ["results_2024-01-13T16-15-20.775949.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T16-15-20.775949.parquet"]}]}]} | 2024-01-13T16:18:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Praneeth/StarMix-7B-slerp
Dataset automatically created during the evaluation run of model Praneeth/StarMix-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T16:15:20.775949 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Praneeth/StarMix-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Praneeth/StarMix-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:15:20.775949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Praneeth/StarMix-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Praneeth/StarMix-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:15:20.775949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eec8668718d5c20f0add889b68ee73c9d864986d |
# Dataset Card for Evaluation run of Kquant03/Hippolyta-7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Hippolyta-7B-bf16](https://huggingface.co/Kquant03/Hippolyta-7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16",
"harness_winogrande_5",
split="train")
```
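The same call works for any configuration listed in this card's metadata. For example, a minimal sketch (assuming only that the `datasets` library is installed) that loads the aggregated `results` configuration at its `latest` split:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16",
    "results",
    split="latest",
)
print(results)
```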
## Latest results
These are the [latest results from run 2024-01-13T16:20:49.648715](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16/blob/main/results_2024-01-13T16-20-49.648715.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5710190127480856,
"acc_stderr": 0.03332687168090072,
"acc_norm": 0.5813613001110371,
"acc_norm_stderr": 0.03416710231417157,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431438,
"mc2": 0.5573938277943279,
"mc2_stderr": 0.015496591664705626
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642478,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6106353316072496,
"acc_stderr": 0.004866096880941442,
"acc_norm": 0.7998406691894046,
"acc_norm_stderr": 0.003993017173367213
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547822,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547822
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650744,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650744
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998879,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998879
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424523,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424523
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625686,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.01268397251359881,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.01268397251359881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431438,
"mc2": 0.5573938277943279,
"mc2_stderr": 0.015496591664705626
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998282
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073872
}
}
```
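If you prefer to work with the raw JSON above rather than the parquet configurations, a small sketch along these lines should work (assuming `huggingface_hub` is installed; the filename is taken from the link above, and the exact nesting inside the downloaded file may differ slightly from the excerpt shown here):

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16",
    filename="results_2024-01-13T16-20-49.648715.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Inspect the top-level structure before picking out metric blocks such as "all".
print(list(run.keys()))
```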
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16 | [
"region:us"
] | 2024-01-13T16:23:11+00:00 | {"pretty_name": "Evaluation run of Kquant03/Hippolyta-7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Hippolyta-7B-bf16](https://huggingface.co/Kquant03/Hippolyta-7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T16:20:49.648715](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16/blob/main/results_2024-01-13T16-20-49.648715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5710190127480856,\n \"acc_stderr\": 0.03332687168090072,\n \"acc_norm\": 0.5813613001110371,\n \"acc_norm_stderr\": 0.03416710231417157,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431438,\n \"mc2\": 0.5573938277943279,\n \"mc2_stderr\": 0.015496591664705626\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642478,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6106353316072496,\n \"acc_stderr\": 0.004866096880941442,\n \"acc_norm\": 0.7998406691894046,\n \"acc_norm_stderr\": 0.003993017173367213\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547822,\n \"acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547822\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650744,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650744\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335833,\n 
\"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306376,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998879,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998879\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625686,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625686\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.576797385620915,\n \"acc_stderr\": 0.01998780976948206,\n \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.01998780976948206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431438,\n \"mc2\": 0.5573938277943279,\n \"mc2_stderr\": 0.015496591664705626\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073872\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Hippolyta-7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-20-49.648715.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-20-49.648715.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-20-49.648715.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-20-49.648715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-20-49.648715.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T16_20_49.648715", "path": ["**/details_harness|winogrande|5_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T16-20-49.648715.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T16_20_49.648715", "path": ["results_2024-01-13T16-20-49.648715.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T16-20-49.648715.parquet"]}]}]} | 2024-01-13T16:23:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Hippolyta-7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Hippolyta-7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
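For instance (a minimal sketch following the loading pattern used by the other evaluation cards in this dump; the repo id is inferred from the usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Load one of the 63 task configurations of this evaluation run;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_Kquant03__Hippolyta-7B-bf16",
                    "harness_winogrande_5",
                    split="train")
```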
## Latest results
These are the latest results from run 2024-01-13T16:20:49.648715 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Hippolyta-7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Hippolyta-7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:20:49.648715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Hippolyta-7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Hippolyta-7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:20:49.648715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d7559feddc6df43691cf1f72eae2bdac5d549e80 |
# Dataset of m1895/M1895/纳甘左轮 (Girls' Frontline)
This is the dataset of m1895/M1895/纳甘左轮 (Girls' Frontline), containing 204 images and their tags.
The core tags of this character are `blonde_hair, red_eyes, hat, bangs, long_hair, hair_between_eyes, fur_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 204 | 331.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 204 | 154.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 531 | 375.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 204 | 275.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 531 | 577.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m1895_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
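The other packages from the table above can be fetched the same way; for example (a small sketch reusing the repo id and the `dataset-800.zip` filename listed in the package table):

```python
from huggingface_hub import hf_hub_download

# grab the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/m1895_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)
```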
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, belt_buckle, blush, brown_gloves, brown_skirt, fingerless_gloves, handgun, holding_gun, jacket_on_shoulders, long_sleeves, revolver, solo, white_jacket, white_shirt, brown_belt, center_frills, looking_at_viewer, object_namesake, open_mouth, smile, black_gloves, black_socks, kneehighs, one_eye_closed, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, belt_buckle, blush, brown_skirt, center_frills, jacket_on_shoulders, long_sleeves, solo, white_jacket, white_shirt, fake_animal_ears, mod3_(girls'_frontline), pleated_skirt, animal_ear_fluff, animal_hat, brown_belt, looking_at_viewer, open_mouth, red_belt, white_background, brown_gloves, simple_background, single_glove, :d, fingerless_gloves, hand_on_hip, star_(symbol) |
| 2 | 8 |  |  |  |  |  | 1girl, blush, brown_belt, brown_skirt, center_frills, long_sleeves, white_jacket, white_shirt, belt_buckle, holding, simple_background, solo, black_gloves, brown_gloves, fingerless_gloves, jacket_on_shoulders, open_mouth, :d, grey_background, looking_at_viewer, ushanka, brown_background, closed_eyes, white_background |
| 3 | 9 |  |  |  |  |  | 1girl, blush, fingerless_gloves, solo, white_jacket, white_shirt, long_sleeves, upper_body, black_gloves, brown_gloves, open_mouth, simple_background, looking_at_viewer, white_background, center_frills, :d |
| 4 | 9 |  |  |  |  |  | 1girl, :d, black_headwear, blue_cape, blue_flower, hair_flower, mini_top_hat, official_alternate_costume, open_mouth, solo, tilted_headwear, blush, vertical_stripes, electric_guitar, holding_instrument, looking_at_viewer, puffy_short_sleeves, braid, mismatched_gloves, one_side_up, striped_gloves, black_shirt, elbow_gloves, fingerless_gloves, frills, holding_microphone, long_sleeves, upper_body, white_gloves, white_shirt, ascot, collared_shirt, jacket, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, blue_headwear, earrings, eyewear_on_head, official_alternate_costume, simple_background, sunglasses, ahoge, blue_flower, hair_flower, looking_at_viewer, smile, solo, white_background, bare_shoulders, blue_dress, blush, choker, closed_mouth, upper_body, black_dress, collarbone, mini_top_hat, single_hair_bun |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | belt_buckle | blush | brown_gloves | brown_skirt | fingerless_gloves | handgun | holding_gun | jacket_on_shoulders | long_sleeves | revolver | solo | white_jacket | white_shirt | brown_belt | center_frills | looking_at_viewer | object_namesake | open_mouth | smile | black_gloves | black_socks | kneehighs | one_eye_closed | white_background | fake_animal_ears | mod3_(girls'_frontline) | pleated_skirt | animal_ear_fluff | animal_hat | red_belt | simple_background | single_glove | :d | hand_on_hip | star_(symbol) | holding | grey_background | ushanka | brown_background | closed_eyes | upper_body | black_headwear | blue_cape | blue_flower | hair_flower | mini_top_hat | official_alternate_costume | tilted_headwear | vertical_stripes | electric_guitar | holding_instrument | puffy_short_sleeves | braid | mismatched_gloves | one_side_up | striped_gloves | black_shirt | elbow_gloves | frills | holding_microphone | white_gloves | ascot | collared_shirt | jacket | blue_headwear | earrings | eyewear_on_head | sunglasses | ahoge | bare_shoulders | blue_dress | choker | closed_mouth | black_dress | collarbone | single_hair_bun |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:---------------|:--------------|:--------------------|:----------|:--------------|:----------------------|:---------------|:-----------|:-------|:---------------|:--------------|:-------------|:----------------|:--------------------|:------------------|:-------------|:--------|:---------------|:--------------|:------------|:-----------------|:-------------------|:-------------------|:--------------------------|:----------------|:-------------------|:-------------|:-----------|:--------------------|:---------------|:-----|:--------------|:----------------|:----------|:------------------|:----------|:-------------------|:--------------|:-------------|:-----------------|:------------|:--------------|:--------------|:---------------|:-----------------------------|:------------------|:-------------------|:------------------|:---------------------|:----------------------|:--------|:--------------------|:--------------|:-----------------|:--------------|:---------------|:---------|:---------------------|:---------------|:--------|:-----------------|:---------|:----------------|:-----------|:------------------|:-------------|:--------|:-----------------|:-------------|:---------|:---------------|:--------------|:-------------|:------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | X | | X | | | | X | | | | | | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | X | | X | | | | X | | X | X | X | | X | X | | X | | X | | | | X | | | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | X | | | | X | | X | | X | | | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | | | | | | X | | | | | X | | | X | | | | | X | | | | | | | X | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m1895_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T16:45:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:31:20+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m1895/M1895/纳甘左轮 (Girls' Frontline)
==============================================
This is the dataset of m1895/M1895/纳甘左轮 (Girls' Frontline), containing 204 images and their tags.
The core tags of this character are 'blonde\_hair, red\_eyes, hat, bangs, long\_hair, hair\_between\_eyes, fur\_hat, white\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2ddcecfe0fb2d1acd1757f8f0fd644f2f93fe18f |
# Dataset Card for Evaluation run of superlazycoder/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [superlazycoder/NeuralPipe-7B-slerp](https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
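# each of the 63 evaluated tasks is exposed as its own configuration;
# the "train" split always points to the latest results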
data = load_dataset("open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp",
"harness_winogrande_5",
split="train")
```
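The aggregated metrics can be pulled the same way through the "results" configuration described above (a sketch; the "latest" split name follows the convention these cards use for pointing at the most recent run):

```python
from datasets import load_dataset

# the "results" configuration stores the aggregated metrics of the run;
# the "latest" split always resolves to the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp",
                       "results",
                       split="latest")
```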
## Latest results
These are the [latest results from run 2024-01-13T16:47:37.959217](https://huggingface.co/datasets/open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp/blob/main/results_2024-01-13T16-47-37.959217.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6445269708058093,
"acc_stderr": 0.03218714474134609,
"acc_norm": 0.6449418405596148,
"acc_norm_stderr": 0.03284511879516387,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518829
},
"harness|hellaswag|10": {
"acc": 0.6701852220673172,
"acc_stderr": 0.0046918486653990685,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.012824066621488845
}
}
```
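For quick inspection, the per-task metrics above can be flattened into a table (a small sketch; it assumes the JSON shown above has been saved locally as `results.json`):

```python
import json
import pandas as pd

# one row per benchmark task, with columns such as acc / acc_stderr / acc_norm;
# tasks that lack a given metric (e.g. truthfulqa only reports mc1/mc2) get NaN
with open("results.json") as f:
    metrics = json.load(f)

df = pd.DataFrame(metrics).T
print(df.sort_values("acc", ascending=False).head(10))
```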
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp | [
"region:us"
] | 2024-01-13T16:49:59+00:00 | {"pretty_name": "Evaluation run of superlazycoder/NeuralPipe-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [superlazycoder/NeuralPipe-7B-slerp](https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T16:47:37.959217](https://huggingface.co/datasets/open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp/blob/main/results_2024-01-13T16-47-37.959217.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445269708058093,\n \"acc_stderr\": 0.03218714474134609,\n \"acc_norm\": 0.6449418405596148,\n \"acc_norm_stderr\": 0.03284511879516387,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518829\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n \"acc_stderr\": 0.0046918486653990685,\n \"acc_norm\": 0.8616809400517825,\n \"acc_norm_stderr\": 0.003445289925011734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8352490421455939,\n \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \"acc_stderr\": 0.012824066621488845\n 
}\n}\n```", "repo_url": "https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T16_47_37.959217", "path": ["**/details_harness|winogrande|5_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T16-47-37.959217.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T16_47_37.959217", "path": ["results_2024-01-13T16-47-37.959217.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T16-47-37.959217.parquet"]}]}]} | 2024-01-13T16:50:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of superlazycoder/NeuralPipe-7B-slerp
Dataset automatically created during the evaluation run of model superlazycoder/NeuralPipe-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
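A minimal load sketch (the details repository path below is inferred from the evaluated model's name and is an assumption — verify the exact dataset id on the Hub):

```python
from datasets import load_dataset

# Repo path inferred from the model name "superlazycoder/NeuralPipe-7B-slerp";
# confirm the exact dataset id on the Hub before relying on it.
data = load_dataset(
    "open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp",
    "harness_winogrande_5",  # any configuration listed in this card works here
    split="train",
)
```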
## Latest results
These are the latest results from run 2024-01-13T16:47:37.959217 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of superlazycoder/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model superlazycoder/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:47:37.959217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of superlazycoder/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model superlazycoder/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:47:37.959217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
14ecaef2c7472844e3685817f759bff8eec77e81 |
# Dataset Card for Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser",
"harness_winogrande_5",
split="train")
```
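Once loaded, the split is a regular `datasets.Dataset`; a quick way to see what it contains (exact fields vary by eval task, so inspect before indexing):

```python
# Inspect the schema and one example; column names depend on the eval harness task
print(data.column_names)
print(data[0])
```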
## Latest results
These are the [latest results from run 2024-01-13T16:50:26.517326](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-13T16-50-26.517326.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6337373508284295,
"acc_stderr": 0.032383816315627395,
"acc_norm": 0.6380892308683148,
"acc_norm_stderr": 0.03302967908991101,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6414925770737219,
"mc2_stderr": 0.015103448074375492
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955009,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537304
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.00468533484403866,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010344,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437413,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437413
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287414,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287414
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134135,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134135
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355295,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6414925770737219,
"mc2_stderr": 0.015103448074375492
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386779
},
"harness|gsm8k|5": {
"acc": 0.44351781652767247,
"acc_stderr": 0.013684327592606163
}
}
```
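To pull these aggregated numbers programmatically rather than copying them from the card, one option is the "results" configuration described above — a sketch only, since the exact schema of the results table is not documented here:

```python
from datasets import load_dataset

# "results" aggregates all runs; the "latest" split points at the newest one
# (both names come from this card's config metadata).
results = load_dataset(
    "open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before indexing into it
```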
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser | [
"region:us"
] | 2024-01-13T16:52:42+00:00 | {"pretty_name": "Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser", "dataset_summary": "Dataset automatically created during the evaluation run of model [fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T16:50:26.517326](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-13T16-50-26.517326.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6337373508284295,\n \"acc_stderr\": 0.032383816315627395,\n \"acc_norm\": 0.6380892308683148,\n \"acc_norm_stderr\": 0.03302967908991101,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6414925770737219,\n \"mc2_stderr\": 0.015103448074375492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955009,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537304\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n \"acc_stderr\": 0.00468533484403866,\n \"acc_norm\": 0.8630750846444931,\n \"acc_norm_stderr\": 0.0034306550069275825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654685,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654685\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n 
\"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n 
\"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437413,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437413\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389104,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389104\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287414,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287414\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n 
\"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134135,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134135\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 0.3675977653631285,\n \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6414925770737219,\n \"mc2_stderr\": 0.015103448074375492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 
0.011398593419386779\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44351781652767247,\n \"acc_stderr\": 0.013684327592606163\n }\n}\n```", "repo_url": "https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["**/details_harness|winogrande|5_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-13T16-50-26.517326.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T16_50_26.517326", "path": ["results_2024-01-13T16-50-26.517326.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T16-50-26.517326.parquet"]}]}]} | 2024-01-13T16:53:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
Dataset automatically created during the evaluation run of model fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
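```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser",
	"harness_winogrande_5",
	split="train")
```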
## Latest results
These are the latest results from run 2024-01-13T16:50:26.517326 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser\n\n\n\nDataset automatically created during the evaluation run of model fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:50:26.517326(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser\n\n\n\nDataset automatically created during the evaluation run of model fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:50:26.517326(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
237746cbf32d063ef33fc0dcbbfd9581d82eb341 |
# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Kunoichi-DPO-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B",
"harness_winogrande_5",
split="train")
```
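Before choosing a task, the available configurations can be enumerated with a standard `datasets` utility; this is a generic sketch, not specific to this card:

```python
from datasets import get_dataset_config_names

# Enumerate all available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B")
print(configs)
```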
## Latest results
These are the [latest results from run 2024-01-13T16:55:23.006879](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B/blob/main/results_2024-01-13T16-55-23.006879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6516847775816889,
"acc_stderr": 0.03213509047374522,
"acc_norm": 0.652545344935636,
"acc_norm_stderr": 0.0327865460086491,
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6730598038833626,
"mc2_stderr": 0.015329856799704078
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587335,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778768
},
"harness|hellaswag|10": {
"acc": 0.7013543118900617,
"acc_stderr": 0.004567287775700561,
"acc_norm": 0.8714399522007569,
"acc_norm_stderr": 0.003340282993990809
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.01660256461504993,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.01660256461504993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6730598038833626,
"mc2_stderr": 0.015329856799704078
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.01111698339239266
},
"harness|gsm8k|5": {
"acc": 0.6398786959818044,
"acc_stderr": 0.01322255942325049
}
}
```
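If you download the raw results file, the per-task scores can be aggregated directly. As a sketch (assuming the file contains the structure printed above, possibly nested under a top-level `"results"` key), here is how one might average accuracy over the MMLU subtasks:
```python
import json

# Assumes a local copy of the results file shown above, e.g. downloaded
# from the repo; the filename below is the one referenced on this card.
path = "results_2024-01-13T16-55-23.006879.json"

with open(path) as f:
    report = json.load(f)

# The scores may sit at the top level (as printed above) or be nested
# under a "results" key in the raw file; handle both.
scores = report.get("results", report)

# Average 5-shot accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in scores.items() if "hendrycksTest" in k]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```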
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B | [
"region:us"
] | 2024-01-13T16:57:41+00:00 | {"pretty_name": "Evaluation run of SanjiWatsuki/Kunoichi-DPO-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [SanjiWatsuki/Kunoichi-DPO-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T16:55:23.006879](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B/blob/main/results_2024-01-13T16-55-23.006879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6516847775816889,\n \"acc_stderr\": 0.03213509047374522,\n \"acc_norm\": 0.652545344935636,\n \"acc_norm_stderr\": 0.0327865460086491,\n \"mc1\": 0.5030599755201959,\n \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6730598038833626,\n \"mc2_stderr\": 0.015329856799704078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587335,\n \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778768\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7013543118900617,\n \"acc_stderr\": 0.004567287775700561,\n \"acc_norm\": 0.8714399522007569,\n \"acc_norm_stderr\": 0.003340282993990809\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.01660256461504993,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.01660256461504993\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6730598038833626,\n \"mc2_stderr\": 0.015329856799704078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.01111698339239266\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6398786959818044,\n \"acc_stderr\": 0.01322255942325049\n }\n}\n```", 
"repo_url": "https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-55-23.006879.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-55-23.006879.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-55-23.006879.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T16-55-23.006879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-55-23.006879.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T16_55_23.006879", "path": ["**/details_harness|winogrande|5_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T16-55-23.006879.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T16_55_23.006879", "path": ["results_2024-01-13T16-55-23.006879.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T16-55-23.006879.parquet"]}]}]} | 2024-01-13T16:58:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-7B
Dataset automatically created during the evaluation run of model SanjiWatsuki/Kunoichi-DPO-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
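A minimal sketch, assuming the dataset follows the leaderboard's usual `details_<org>__<model>` repository naming (the repository id below is inferred from that convention, not quoted from this card):

```python
from datasets import load_dataset

# Repository id assumed from the details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-7B",
	"harness_winogrande_5",
	split="train")
```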
## Latest results
These are the latest results from run 2024-01-13T16:55:23.006879 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model SanjiWatsuki/Kunoichi-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:55:23.006879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model SanjiWatsuki/Kunoichi-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T16:55:23.006879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0464595d2cf2e1aa5c8e67cbb37336812f34bf53 |
# Dataset Card for Evaluation run of alnrg2arg/test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test](https://huggingface.co/alnrg2arg/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test",
"harness_winogrande_5",
split="train")
```
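The aggregated metrics live in the "results" configuration; a minimal sketch of pulling the most recent aggregate, assuming the "latest" split name used by the per-task configurations applies to the "results" configuration as well:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" tracks the newest run
agg = load_dataset("open-llm-leaderboard/details_alnrg2arg__test",
	"results",
	split="latest")
```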
## Latest results
These are the [latest results from run 2024-01-13T17:13:28.432807](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test/blob/main/results_2024-01-13T17-13-28.432807.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23383463507765326,
"acc_stderr": 0.03001653266955312,
"acc_norm": 0.2331114809676361,
"acc_norm_stderr": 0.030796413035811515,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22866894197952217,
"acc_stderr": 0.012272853582540788,
"acc_norm": 0.23037542662116042,
"acc_norm_stderr": 0.01230492841874761
},
"harness|hellaswag|10": {
"acc": 0.2529376618203545,
"acc_stderr": 0.004338071318912311,
"acc_norm": 0.2523401712806214,
"acc_norm_stderr": 0.004334676952703861
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031708,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031708
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494586,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676653,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676653
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.016453399332279326,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.016453399332279326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.03446296217088426,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.03446296217088426
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859325
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
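As a quick sanity check, the MMLU-style average can be recomputed from the per-task accuracies above. A sketch, assuming `results` holds the JSON block above as a Python dict (the name `results` is illustrative):

```python
# Average the hendrycksTest (MMLU) subtask accuracies from the run above.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")  # roughly 0.23 for this run
```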
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__test | [
"region:us"
] | 2024-01-13T17:15:44+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/test", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test](https://huggingface.co/alnrg2arg/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:13:28.432807](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test/blob/main/results_2024-01-13T17-13-28.432807.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23383463507765326,\n \"acc_stderr\": 0.03001653266955312,\n \"acc_norm\": 0.2331114809676361,\n \"acc_norm_stderr\": 0.030796413035811515,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22866894197952217,\n \"acc_stderr\": 0.012272853582540788,\n \"acc_norm\": 0.23037542662116042,\n \"acc_norm_stderr\": 0.01230492841874761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2529376618203545,\n \"acc_stderr\": 0.004338071318912311,\n \"acc_norm\": 0.2523401712806214,\n \"acc_norm_stderr\": 0.004334676952703861\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031708,\n \"acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031708\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 
0.021444547301560486,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n \"acc_stderr\": 0.015491088951494586,\n \"acc_norm\": 0.2503192848020434,\n \"acc_norm_stderr\": 
0.015491088951494586\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676653,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676653\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.016453399332279326,\n \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.016453399332279326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.03446296217088426,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.03446296217088426\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859325\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["**/details_harness|winogrande|5_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-13-28.432807.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T17_13_28.432807", "path": ["results_2024-01-13T17-13-28.432807.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T17-13-28.432807.parquet"]}]}]} | 2024-01-13T17:16:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/test
Dataset automatically created during the evaluation run of model alnrg2arg/test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
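The original loading snippet did not survive in this dump, so here is a minimal sketch with the `datasets` library. The details repo id `open-llm-leaderboard/details_alnrg2arg__test` is an assumption inferred from the model name; the config name `harness_winogrande_5` and the `latest` split are taken from the configurations listed in this card's metadata.

```python
from datasets import load_dataset

# NOTE: the repo id below is an assumption inferred from the model name
# "alnrg2arg/test"; the config name and split come from this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test",  # hypothetical details repo id
    "harness_winogrande_5",  # one of the 63 evaluated-task configurations
    split="latest",          # always points to the most recent run
)
print(data)
```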
## Latest results
These are the latest results from run 2024-01-13T17:13:28.432807 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/test\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:13:28.432807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/test\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:13:28.432807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4fb41549af2d54a61b09311863d20f264ecd7d59 |
# Dataset of myrrh (Fire Emblem)
This is the dataset of myrrh (Fire Emblem), containing 247 images and their tags.
The core tags of this character are `purple_hair, twintails, wings, dragon_wings, red_eyes, multi-tied_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 247 | 335.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 247 | 190.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 574 | 395.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 247 | 298.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 574 | 549.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/myrrh_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
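The `800` and `1200` packages above ship as plain IMG+TXT archives rather than waifuc raws, so they can be consumed without waifuc. Below is a minimal sketch for iterating them; the flat archive layout and the same-stem `.txt` caption pairing are assumptions based on the package descriptions, not verified against the archive.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/myrrh_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)
img_dir = 'dataset_800'
os.makedirs(img_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_dir)

# ASSUMPTION: flat layout, each image paired with a same-stem .txt tag file
for name in sorted(os.listdir(img_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(img_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```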
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_background, closed_mouth, simple_background, smile, dress, dragon_girl |
| 1 | 5 |  |  |  |  |  | 1girl, dress, sandals, simple_background, solo, white_background, wristband, dragon_girl, full_body, looking_at_viewer, closed_mouth, own_hands_together |
| 2 | 31 |  |  |  |  |  | 1girl, solo, long_sleeves, fake_animal_ears, halloween_costume, bat_ears, fur_trim, dress, simple_background, open_mouth, white_background |
| 3 | 6 |  |  |  |  |  | 1girl, blush, nipples, nude, pussy, small_breasts, solo, navel, loli, spread_legs |
| 4 | 5 |  |  |  |  |  | 1girl, hetero, navel, nipples, open_mouth, small_breasts, solo_focus, blush, mosaic_censoring, sex, vaginal, 1boy, loli, nude, pussy, spread_legs, tears, 3boys, dragon_girl, multiple_penises, panties_around_one_leg |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_background | closed_mouth | simple_background | smile | dress | dragon_girl | sandals | wristband | full_body | own_hands_together | long_sleeves | fake_animal_ears | halloween_costume | bat_ears | fur_trim | open_mouth | blush | nipples | nude | pussy | small_breasts | navel | loli | spread_legs | hetero | solo_focus | mosaic_censoring | sex | vaginal | 1boy | tears | 3boys | multiple_penises | panties_around_one_leg |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:---------------|:--------------------|:--------|:--------|:--------------|:----------|:------------|:------------|:---------------------|:---------------|:-------------------|:--------------------|:-----------|:-----------|:-------------|:--------|:----------|:-------|:--------|:----------------|:--------|:-------|:--------------|:---------|:-------------|:-------------------|:------|:----------|:-------|:--------|:--------|:-------------------|:-------------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 31 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/myrrh_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:51:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:44:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of myrrh (Fire Emblem)
==============================
This is the dataset of myrrh (Fire Emblem), containing 247 images and their tags.
The core tags of this character are 'purple\_hair, twintails, wings, dragon\_wings, red\_eyes, multi-tied\_hair, long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2bca7ae32301a14f010d85a1a22a427496d52f22 |
This is a subset of the Natural Questions dataset for the long-form question answering task. | arnaik/natural_questions_truncated | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"license:mit",
"region:us"
] | 2024-01-17T16:58:02+00:00 | {"license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"]} | 2024-01-19T16:00:33+00:00 | [] | [] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #license-mit #region-us
|
This is a subset of the Natural Questions dataset for the long-form question answering task. | [] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #license-mit #region-us \n"
] |
d600c232cee7ae28d32967013c738d1e96942ef1 |
# Dataset Card for the SMNLI-MT
The SMNLI-MT datasets are machine-translated versions of the Stanford NLI and MultiNLI datasets in Maltese.
## Dataset Details
### Dataset Description
The datasets were translated using Google Cloud Translate as part of an initial exploration of NLI in the Maltese language.
- **Curated by:** Matthew Darmanin
- **Language(s) (NLP):** Maltese
- **License:** CC 4.0
## Dataset Structure
The datasets are in the form of CSV files, delimited by semi-colons (;) and encoded in UTF-8.
Each dataset contains the following columns:
- **id** - the index of the sentence pair
- **premise** - the premise sentence
- **hypothesis** - the hypothesis sentence
- **gold_label** - the consensus label assigned in the original datasets; either _entailment_, _contradiction_ or _neutral_
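Since the files are semicolon-delimited UTF-8 CSVs, they can be read directly with pandas. A minimal sketch follows, using the `snli_mt.csv` and `mnli_mt.csv` file names declared in this card's configs:

```python
import pandas as pd

# semicolon-delimited, UTF-8 encoded, as described above
snli = pd.read_csv("snli_mt.csv", sep=";", encoding="utf-8")
mnli = pd.read_csv("mnli_mt.csv", sep=";", encoding="utf-8")

# columns: id, premise, hypothesis, gold_label
print(snli[["premise", "hypothesis", "gold_label"]].head())
```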
## Dataset Card Contact
E-mail: <[email protected]> | darmanin-matt/smnli_mt | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:mt",
"license:cc-by-sa-4.0",
"nlp",
"nli",
"rte",
"maltese",
"malti",
"region:us"
] | 2024-01-17T17:02:16+00:00 | {"language": ["mt"], "license": "cc-by-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "tags": ["nlp", "nli", "rte", "maltese", "malti"], "configs": [{"config_name": "snli", "data_files": "snli_mt.csv", "sep": ";"}, {"config_name": "mnli", "data_files": "mnli_mt.csv", "sep": ";"}]} | 2024-01-17T17:42:21+00:00 | [] | [
"mt"
] | TAGS
#task_categories-text-classification #size_categories-100K<n<1M #language-Maltese #license-cc-by-sa-4.0 #nlp #nli #rte #maltese #malti #region-us
|
# Dataset Card for the SMNLI-MT
The SMNLI-MT datasets are machine-translated versions of the Stanford NLI and MultiNLI datasets in Maltese.
## Dataset Details
### Dataset Description
The datasets were translated using Google Cloud Translate as part of an initial exploration of NLI in the Maltese language.
- Curated by: Matthew Darmanin
- Language(s) (NLP): Maltese
- License: CC BY-SA 4.0
## Dataset Structure
The datasets are in the form of CSV files, delimited by semi-colons (;) and encoded in UTF-8.
Each dataset contains the following columns:
- id - the index of the sentence pair
- premise - the premise sentence
- hypothesis - the hypothesis sentence
- gold_label - the consensus label assigned in the original datasets; either _entailment_, _contradiction_ or _neutral_
## Dataset Card Contact
E-mail: <URL@URL> | [
"# Dataset Card for the SMNLI-MT\n\nThe SMNLI-MT datasets are machine-translated versions of the Stanford NLI and MultiNLI datasets in Maltese.",
"## Dataset Details",
"### Dataset Description\n\nThe datasets were translated using the Google Cloud Translate as part of the initial exploration of NLI in the Maltese language.\n\n\n- Curated by: Matthew Darmanin\n- Language(s) (NLP): Maltese\n- License: CC 4.0",
"## Dataset Structure\n\nThe datasets are in the form of CSV files, delimited by semi-colons (;) and encoded in UTF-8.\n\nEach dataset contains the following columns:\n - id - the index of the sentence pair\n - premise - the premise sentence\n - hypothesis - the hypothesis sentence\n - gold_label - the consensus label assigned in the original datasets; either _entailment_, _contradiction_ or _neutral_",
"## Dataset Card Contact\n\nE-mail: <URL@URL>"
] | [
"TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-Maltese #license-cc-by-sa-4.0 #nlp #nli #rte #maltese #malti #region-us \n",
"# Dataset Card for the SMNLI-MT\n\nThe SMNLI-MT datasets are machine-translated versions of the Stanford NLI and MultiNLI datasets in Maltese.",
"## Dataset Details",
"### Dataset Description\n\nThe datasets were translated using the Google Cloud Translate as part of the initial exploration of NLI in the Maltese language.\n\n\n- Curated by: Matthew Darmanin\n- Language(s) (NLP): Maltese\n- License: CC 4.0",
"## Dataset Structure\n\nThe datasets are in the form of CSV files, delimited by semi-colons (;) and encoded in UTF-8.\n\nEach dataset contains the following columns:\n - id - the index of the sentence pair\n - premise - the premise sentence\n - hypothesis - the hypothesis sentence\n - gold_label - the consensus label assigned in the original datasets; either _entailment_, _contradiction_ or _neutral_",
"## Dataset Card Contact\n\nE-mail: <URL@URL>"
] |
539c681f895c1f85c4883940ed9dc4cf0cef5c57 |
# Dataset of fa (Fire Emblem)
This is the dataset of fa (Fire Emblem), containing 152 images and their tags.
The core tags of this character are `short_hair, pointy_ears, facial_mark, purple_hair, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 122.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fa_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 152 | 85.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fa_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 269 | 152.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fa_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 152 | 114.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fa_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 269 | 193.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fa_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fa_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, forehead_mark, long_sleeves, solo, open_mouth, simple_background, white_background, cape, dress, full_body, boots, bow, bag, closed_eyes |
| 1 | 35 |  |  |  |  |  | forehead_mark, fur_trim, long_sleeves, 1girl, cape, reindeer_antlers, solo, dress, brown_gloves, open_mouth, bell, boots, bow, full_body, simple_background |
| 2 | 10 |  |  |  |  |  | 1girl, forehead_mark, rabbit_ears, fake_animal_ears, white_gloves, easter_egg, hair_flower, solo, basket, open_mouth, pantyhose, see-through, simple_background, holding, smile, white_background, bow, full_body, leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | forehead_mark | long_sleeves | solo | open_mouth | simple_background | white_background | cape | dress | full_body | boots | bow | bag | closed_eyes | fur_trim | reindeer_antlers | brown_gloves | bell | rabbit_ears | fake_animal_ears | white_gloves | easter_egg | hair_flower | basket | pantyhose | see-through | holding | smile | leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:---------------|:-------|:-------------|:--------------------|:-------------------|:-------|:--------|:------------|:--------|:------|:------|:--------------|:-----------|:-------------------|:---------------|:-------|:--------------|:-------------------|:---------------|:-------------|:--------------|:---------|:------------|:--------------|:----------|:--------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fa_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:03:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:30:32+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fa (Fire Emblem)
===========================
This is the dataset of fa (Fire Emblem), containing 152 images and their tags.
The core tags of this character are 'short\_hair, pointy\_ears, facial\_mark, purple\_hair, green\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f20885f3924b86cbb252d5652cebe66be5bb9fe8 |
# Dataset of lilina (Fire Emblem)
This is the dataset of lilina (Fire Emblem), containing 392 images and their tags.
The core tags of this character are `blue_hair, long_hair, blue_eyes, hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 392 | 421.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 392 | 270.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 813 | 520.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 392 | 387.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 813 | 690.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilina_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lilina_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, dress, looking_at_viewer, simple_background, solo, smile, white_background, blush, open_mouth, book, gloves, jewelry |
| 1 | 5 |  |  |  |  |  | 1girl, red_capelet, red_headwear, shiny_hair, simple_background, solo, upper_body, bangs, dress, hair_between_eyes, looking_at_viewer, white_background, blush, jewelry, :d, open_mouth |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, smile, dress, open_mouth, simple_background, couple, gloves, armor, blush, cape, closed_eyes, hetero, red_hair, white_background, jewelry, short_hair |
| 3 | 10 |  |  |  |  |  | bridal_veil, smile, wedding_dress, 1girl, bride, official_alternate_costume, solo, white_dress, cleavage, looking_at_viewer, simple_background, white_background, elbow_gloves, hair_flower, open_mouth, white_gloves, upper_body, white_flower, blush, holding_bouquet |
| 4 | 5 |  |  |  |  |  | 1girl, bangs, bouquet, bridal_veil, elbow_gloves, feather_trim, flower, full_body, holding, medium_breasts, solo, thigh_boots, thighhighs, wedding_dress, white_dress, white_footwear, white_gloves, bride, cleavage, detached_collar, gold_trim, hair_ornament, shiny_hair, blush, looking_away, open_mouth, simple_background, smile, white_background, closed_mouth, feathers, high_heel_boots, jewelry, looking_at_viewer, petals, standing, transparent_background |
| 5 | 21 |  |  |  |  |  | 1girl, red_bikini, hair_flower, navel, head_wreath, smile, open_mouth, solo, bangs, looking_at_viewer, official_alternate_costume, holding, blush, hibiscus, water, cloud, day, jewelry, sky, ocean, outdoors |
| 6 | 7 |  |  |  |  |  | 1girl, hetero, mosaic_censoring, nipples, penis, sex, solo_focus, vaginal, 1boy, blush, pantyhose, torn_clothes, cum_in_pussy, large_breasts, open_mouth, medium_breasts, navel, straddling, topless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | looking_at_viewer | simple_background | solo | smile | white_background | blush | open_mouth | book | gloves | jewelry | red_capelet | red_headwear | shiny_hair | upper_body | bangs | hair_between_eyes | :d | 1boy | couple | armor | cape | closed_eyes | hetero | red_hair | short_hair | bridal_veil | wedding_dress | bride | official_alternate_costume | white_dress | cleavage | elbow_gloves | hair_flower | white_gloves | white_flower | holding_bouquet | bouquet | feather_trim | flower | full_body | holding | medium_breasts | thigh_boots | thighhighs | white_footwear | detached_collar | gold_trim | hair_ornament | looking_away | closed_mouth | feathers | high_heel_boots | petals | standing | transparent_background | red_bikini | navel | head_wreath | hibiscus | water | cloud | day | sky | ocean | outdoors | mosaic_censoring | nipples | penis | sex | solo_focus | vaginal | pantyhose | torn_clothes | cum_in_pussy | large_breasts | straddling | topless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------------------|:-------|:--------|:-------------------|:--------|:-------------|:-------|:---------|:----------|:--------------|:---------------|:-------------|:-------------|:--------|:--------------------|:-----|:-------|:---------|:--------|:-------|:--------------|:---------|:-----------|:-------------|:--------------|:----------------|:--------|:-----------------------------|:--------------|:-----------|:---------------|:--------------|:---------------|:---------------|:------------------|:----------|:---------------|:---------|:------------|:----------|:-----------------|:--------------|:-------------|:-----------------|:------------------|:------------|:----------------|:---------------|:---------------|:-----------|:------------------|:---------|:-----------|:-------------------------|:-------------|:--------|:--------------|:-----------|:--------|:--------|:------|:------|:--------|:-----------|:-------------------|:----------|:--------|:------|:-------------|:----------|:------------|:---------------|:---------------|:----------------|:-------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | | X | X | X | X | | X | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | X | X | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | X | X | X | X | | | X | | | X | | X | | | | | | | | | | | X | X | X | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 21 |  |  |  |  |  | X | | X | | X | X | | X | X | | | X | | | | | X | | | | | | | | | | | | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | | | | X | X | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/lilina_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:03:23+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T18:22:51+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lilina (Fire Emblem)
===============================
This is the dataset of lilina (Fire Emblem), containing 392 images and their tags.
The core tags of this character are 'blue\_hair, long\_hair, blue\_eyes, hat, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d0f591f9019c10b07ce433ad4f7ff8675351ad86 |
# Dataset of nn (Fire Emblem)
This is the dataset of nn (Fire Emblem), containing 92 images and their tags.
The core tags of this character are `green_hair, pointy_ears, long_hair, braid, ahoge, purple_eyes, twin_braids, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 93.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 58.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 200 | 120.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 83.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 200 | 159.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nn_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, cape, solo, gloves, garter_straps, thighhighs, boots, smile |
| 1 | 8 |  |  |  |  |  | 1girl, solo, thigh_boots, thighhighs, frills, full_body, garter_straps, red_footwear, short_dress, zettai_ryouiki, red_cape, simple_background, white_background, brown_gloves, open_mouth, white_dress, bag, arm_up, jewelry, long_sleeves, looking_at_viewer, ribbon, shiny_hair, smile |
| 2 | 10 |  |  |  |  |  | solo_focus, nipples, 1boy, 1girl, completely_nude, hetero, open_mouth, blush, pussy, smile, loli, navel, small_breasts, 2girls, heart, looking_at_viewer, penis, sex_from_behind |
| 3 | 6 |  |  |  |  |  | open_mouth, hair_flower, looking_at_viewer, rabbit_ears, 1girl, fake_animal_ears, official_alternate_costume, pink_gloves, solo, animal_hat, bunny_hat, dress, pantyhose, skirt, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | solo | gloves | garter_straps | thighhighs | boots | smile | thigh_boots | frills | full_body | red_footwear | short_dress | zettai_ryouiki | red_cape | simple_background | white_background | brown_gloves | open_mouth | white_dress | bag | arm_up | jewelry | long_sleeves | looking_at_viewer | ribbon | shiny_hair | solo_focus | nipples | 1boy | completely_nude | hetero | blush | pussy | loli | navel | small_breasts | 2girls | heart | penis | sex_from_behind | hair_flower | rabbit_ears | fake_animal_ears | official_alternate_costume | pink_gloves | animal_hat | bunny_hat | dress | pantyhose | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:---------|:----------------|:-------------|:--------|:--------|:--------------|:---------|:------------|:---------------|:--------------|:-----------------|:-----------|:--------------------|:-------------------|:---------------|:-------------|:--------------|:------|:---------|:----------|:---------------|:--------------------|:---------|:-------------|:-------------|:----------|:-------|:------------------|:---------|:--------|:--------|:-------|:--------|:----------------|:---------|:--------|:--------|:------------------|:--------------|:--------------|:-------------------|:-----------------------------|:--------------|:-------------|:------------|:--------|:------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | | X | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nn_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:03:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:26:05+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of nn (Fire Emblem)
===========================
This is the dataset of nn (Fire Emblem), containing 92 images and their tags.
The core tags of this character are 'green\_hair, pointy\_ears, long\_hair, braid, ahoge, purple\_eyes, twin\_braids, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
dc4c5f9f36e35777a75d252447aa7f009af87168 |
# Dataset of lethe (Fire Emblem)
This is the dataset of lethe (Fire Emblem), containing 175 images and their tags.
The core tags of this character are `animal_ears, cat_ears, cat_girl, purple_eyes, orange_hair, facial_mark, short_hair, tail, cat_tail, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 175 | 188.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 175 | 117.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 372 | 225.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 175 | 169.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 372 | 305.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lethe_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
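If you only want the pre-processed images rather than the raw archive, the IMG+TXT packages from the table above can be fetched the same way. A sketch, assuming only what the package table shows (the `dataset-800.zip` filename appears in the download URL; the flat image-plus-caption layout is the usual IMG+TXT convention and is an assumption here):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/lethe_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack the image/caption pairs
out_dir = 'lethe_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# each image is expected to sit next to a same-named .txt tag file
print(sorted(os.listdir(out_dir))[:10])
```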
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 57 |  |  |  |  |  | whisker_markings, 1girl, solo, choker, brown_belt, green_shorts, side_slit_shorts, looking_at_viewer, simple_background, bell, wrist_wrap, gloves, thigh_strap, bandaged_arm, white_background |
| 1 | 6 |  |  |  |  |  | blonde_hair, 1girl, blush, medium_breasts, nipples, nude, solo, choker, simple_background, whisker_markings, looking_at_viewer, navel, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, cleavage_cutout, large_breasts, looking_at_viewer, simple_background, whisker_markings, black_bra, blush, cat_cutout, cat_lingerie, jingle_bell, navel, neck_bell, solo, bangs, black_panties, cat_ear_panties, frilled_bra, green_choker, side-tie_panties, white_background |
| 3 | 11 |  |  |  |  |  | 1girl, hetero, penis, solo_focus, 1boy, blush, large_breasts, nipples, whisker_markings, choker, mosaic_censoring, nude, open_mouth, uncensored, cum_on_breasts, facial |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, girl_on_top, hetero, sex, solo_focus, vaginal, cowgirl_position, looking_at_viewer, medium_breasts, nipples, open_mouth, penis, pussy, whisker_markings, bar_censor, pov, bandaged_arm, choker, navel, nude, outdoors, pubic_hair, smile, sweat |
| 5 | 8 |  |  |  |  |  | kimono, whisker_markings, 1girl, bangs, hairband, solo, looking_at_viewer, official_alternate_costume, open_mouth, skirt, hair_ornament, smile, blonde_hair, closed_mouth, fingernails, full_body, green_hakama, jingle_bell, rope, sandals, sash, shiny_hair, simple_background, slit_pupils, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | whisker_markings | 1girl | solo | choker | brown_belt | green_shorts | side_slit_shorts | looking_at_viewer | simple_background | bell | wrist_wrap | gloves | thigh_strap | bandaged_arm | white_background | blonde_hair | blush | medium_breasts | nipples | nude | navel | cleavage_cutout | large_breasts | black_bra | cat_cutout | cat_lingerie | jingle_bell | neck_bell | bangs | black_panties | cat_ear_panties | frilled_bra | green_choker | side-tie_panties | hetero | penis | solo_focus | 1boy | mosaic_censoring | open_mouth | uncensored | cum_on_breasts | facial | girl_on_top | sex | vaginal | cowgirl_position | pussy | bar_censor | pov | outdoors | pubic_hair | smile | sweat | kimono | hairband | official_alternate_costume | skirt | hair_ornament | closed_mouth | fingernails | full_body | green_hakama | rope | sandals | sash | shiny_hair | slit_pupils | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------|:--------|:-------|:---------|:-------------|:---------------|:-------------------|:--------------------|:--------------------|:-------|:-------------|:---------|:--------------|:---------------|:-------------------|:--------------|:--------|:-----------------|:----------|:-------|:--------|:------------------|:----------------|:------------|:-------------|:---------------|:--------------|:------------|:--------|:----------------|:------------------|:--------------|:---------------|:-------------------|:---------|:--------|:-------------|:-------|:-------------------|:-------------|:-------------|:-----------------|:---------|:--------------|:------|:----------|:-------------------|:--------|:-------------|:------|:-----------|:-------------|:--------|:--------|:---------|:-----------|:-----------------------------|:--------|:----------------|:---------------|:--------------|:------------|:---------------|:-------|:----------|:-------|:-------------|:--------------|:-----------|
| 0 | 57 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | X | | X | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | | | X | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/lethe_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:03:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:40:02+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lethe (Fire Emblem)
==============================
This is the dataset of lethe (Fire Emblem), containing 175 images and their tags.
The core tags of this character are 'animal\_ears, cat\_ears, cat\_girl, purple\_eyes, orange\_hair, facial\_mark, short\_hair, tail, cat\_tail, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
c73539b67e69565e9c175df375995a8b09475910 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | adonaivera/crowdsourced-calculator-demo | [
"region:us"
] | 2024-01-17T17:14:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data.csv"}]}]} | 2024-01-17T17:15:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5b81b4f9f31894312298de0466fd23a7caeeb555 | # Dataset Card for "ForwardScreeningFS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Lollitor/ForwardScreeningFS | [
"region:us"
] | 2024-01-17T17:15:22+00:00 | {"dataset_info": {"features": [{"name": "#code", "dtype": "string"}, {"name": "inputs", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16415601, "num_examples": 16245}], "download_size": 1813557, "dataset_size": 16415601}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T17:15:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ForwardScreeningFS"
More Information needed | [
"# Dataset Card for \"ForwardScreeningFS\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ForwardScreeningFS\"\n\nMore Information needed"
] |
8d54765982adc75d9f9cb572e0f45bc09ddb3156 | # Terra 4M Training Log
Terra 4M is a 4.2 million parameter, purely convolutional diffusion model for terrain generation.
This is a log of all the checkpoints and images generated throughout training.
Note that the checkpoint corresponding to step 3,982,014 was selected as the final model.
The training code can be found [here](https://github.com/novaia-computing/ntg/).
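Individual checkpoints from this log can be pulled with `huggingface_hub`; a hedged sketch (the checkpoint filename below is a hypothetical placeholder — list the repo's files first and substitute a real path):

```python
from huggingface_hub import hf_hub_download, list_repo_files

# list everything stored in this training-log repo
files = list_repo_files('novaia/terra-4m-training-log', repo_type='dataset')
print(files[:10])

# download one checkpoint; the filename here is hypothetical —
# replace it with a real path from the listing above
ckpt = hf_hub_download(
    repo_id='novaia/terra-4m-training-log',
    repo_type='dataset',
    filename='checkpoints/step_3982014.safetensors',  # hypothetical name
)
print(ckpt)
```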

 | novaia/terra-4m-training-log | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T17:20:13+00:00 | {"license": "apache-2.0"} | 2024-01-19T01:25:33+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Terra 4M Training Log
Terra 4M is a 4.2 million parameter, purely convolutional diffusion model for terrain generation.
This is a log of all the checkpoints and images generated throughout training.
Note that the checkpoint corresponding to step 3,982,014 was selected as the final model.
The training code can be found here.
!image/png
!image/png | [
"# Terra 4M Training Log\nTerra 4M is a 4.2 million parameter, purely convolutional diffusion model for terrain generation. \nThis is a log of all the checkpoints and images generated throughout training. \nNote that the checkpoint corresponding to step 3,982,014 was selected as the final model.\nThe training code can be found here.\n\n!image/png\n!image/png"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Terra 4M Training Log\nTerra 4M is a 4.2 million parameter, purely convolutional diffusion model for terrain generation. \nThis is a log of all the checkpoints and images generated throughout training. \nNote that the checkpoint corresponding to step 3,982,014 was selected as the final model.\nThe training code can be found here.\n\n!image/png\n!image/png"
] |
346ed3450c81974862e43c3bbc1d63af16868b27 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | adonaivera/ofwat_cleaner_loopx | [
"region:us"
] | 2024-01-17T17:22:45+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data.csv"}]}]} | 2024-01-17T17:22:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
38dcb94a3072975000d7013ce203480c29821584 |
# Dataset of ash (Fire Emblem)
This is the dataset of ash (Fire Emblem), containing 69 images and their tags.
The core tags of this character are `dark-skinned_female, dark_skin, animal_ears, cow_horns, horns, cow_ears, long_hair, breasts, cow_girl, white_hair, bangs, black_horns, tail, brown_eyes, cow_tail, ear_piercing, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 109.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ash_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 69 | 56.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ash_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 175 | 122.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ash_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 69 | 93.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ash_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 175 | 178.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ash_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ash_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------|
| 0 | 39 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bare_shoulders, piercing, jewelry, smile, blush, dress, simple_background, holding, staff |
| 1 | 6 |  |  |  |  |  | 1girl, open_mouth, solo, hagoita, earrings, fur-trimmed_kimono, hair_ornament, sandals, tabi, full_body, smile, white_kimono |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bare_shoulders | piercing | jewelry | smile | blush | dress | simple_background | holding | staff | open_mouth | hagoita | earrings | fur-trimmed_kimono | hair_ornament | sandals | tabi | full_body | white_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------------|:-----------|:----------|:--------|:--------|:--------|:--------------------|:----------|:--------|:-------------|:----------|:-----------|:---------------------|:----------------|:----------|:-------|:------------|:---------------|
| 0 | 39 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/ash_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:27:08+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:41:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ash (Fire Emblem)
============================
This is the dataset of ash (Fire Emblem), containing 69 images and their tags.
The core tags of this character are 'dark-skinned\_female, dark\_skin, animal\_ears, cow\_horns, horns, cow\_ears, long\_hair, breasts, cow\_girl, white\_hair, bangs, black\_horns, tail, brown\_eyes, cow\_tail, ear\_piercing, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1e4ef2be230889cf429ca5b6fad5f5d9e701dd60 |
# Dataset of shez (Fire Emblem)
This is the dataset of shez (Fire Emblem), containing 393 images and their tags.
The core tags of this character are `purple_hair, hair_over_one_eye, purple_eyes, long_hair, breasts, bangs, hair_bun, large_breasts, single_hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 393 | 589.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shez_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 393 | 317.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shez_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 926 | 660.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shez_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 393 | 513.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shez_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 926 | 979.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shez_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shez_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
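Building on the loop above, items can be filtered by tag before export — a small sketch, assuming (as the loop above suggests) that `item.meta['tags']` is a tag-to-confidence mapping and `item.image` is a PIL image:

```python
import os

from waifuc.source import LocalSource

export_dir = 'shez_solo'
os.makedirs(export_dir, exist_ok=True)

# keep only images tagged 'solo', reusing the directory extracted above
source = LocalSource('dataset_dir')
for item in source:
    tags = item.meta['tags']  # assumed: mapping of tag -> confidence
    if 'solo' in tags:
        item.image.save(os.path.join(export_dir, item.meta['filename']))
```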
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, gloves, looking_at_viewer, simple_background, smile, solo |
| 1 | 7 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, gloves, looking_at_viewer, simple_background, solo |
| 2 | 5 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, looking_at_viewer, simple_background, smile, solo |
| 3 | 10 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, looking_at_viewer, simple_background, solo |
| 4 | 14 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, looking_at_viewer, simple_background, solo, sword, gloves, holding |
| 5 | 6 |  |  |  |  |  | 1girl, armor, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, dual_wielding, looking_at_viewer, simple_background, solo, gloves, holding_sword |
| 6 | 5 |  |  |  |  |  | 2girls, asymmetrical_clothes, cape, choker, cleavage, closed_mouth, simple_background, armor, gloves, medium_hair, smile, looking_at_viewer, closed_eyes |
| 7 | 5 |  |  |  |  |  | 1girl, black_bikini, choker, cleavage, hair_flower, navel, official_alternate_costume, smile, solo, closed_mouth, looking_at_viewer, simple_background, bare_shoulders, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, black_bikini, choker, cleavage, fingerless_gloves, hair_flower, looking_at_viewer, navel, official_alternate_costume, solo, smile |
| 9 | 11 |  |  |  |  |  | 1girl, hetero, nipples, penis, choker, sex, 1boy, armor, asymmetrical_clothes, blush, cum_in_pussy, spread_legs, vaginal, looking_at_viewer, mosaic_censoring, gloves, rape, breasts_out, cleavage, cum_on_breasts, cum_on_hair, facial, navel, open_mouth, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armor | asymmetrical_clothes | cape | choker | cleavage | closed_mouth | gloves | looking_at_viewer | simple_background | smile | solo | sword | holding | dual_wielding | holding_sword | 2girls | medium_hair | closed_eyes | black_bikini | hair_flower | navel | official_alternate_costume | bare_shoulders | white_background | fingerless_gloves | hetero | nipples | penis | sex | 1boy | blush | cum_in_pussy | spread_legs | vaginal | mosaic_censoring | rape | breasts_out | cum_on_breasts | cum_on_hair | facial | open_mouth | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------------|:-------|:---------|:-----------|:---------------|:---------|:--------------------|:--------------------|:--------|:-------|:--------|:----------|:----------------|:----------------|:---------|:--------------|:--------------|:---------------|:--------------|:--------|:-----------------------------|:-----------------|:-------------------|:--------------------|:---------|:----------|:--------|:------|:-------|:--------|:---------------|:--------------|:----------|:-------------------|:-------|:--------------|:-----------------|:--------------|:---------|:-------------|:-------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | X | X | X | X | X | X | X | X | X | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | X | X | | X | X | X | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | X | X | | | X | | X | X | | | | | | | | X | X | X | X | | | X | | | | | | | | | | | | | | | | | |
| 9 | 11 |  |  |  |  |  | X | X | X | | X | X | | X | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/shez_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T17:36:36+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T19:18:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of shez (Fire Emblem)
=============================
This is the dataset of shez (Fire Emblem), containing 393 images and their tags.
The core tags of this character are 'purple\_hair, hair\_over\_one\_eye, purple\_eyes, long\_hair, breasts, bangs, hair\_bun, large\_breasts, single\_hair\_bun', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |